
I think you guys are over-simplifying the situation here. My point is that it's not as if one technology or rendering method (e.g. deferred shading) has trashed something that was otherwise going to work perfectly. It's more that complicated lighting/shading functions are simply harder to antialias. Back when the only significant sources of aliasing were geometric (rasterization) and texture (minification) aliasing, it was fairly straightforward to solve. Now, with the power to write any arbitrarily high-frequency function in a shader, *any* term is a potential source of aliasing.

So, just to direct your righteous anger at some other real sources of aliasing these days: normal maps and per-pixel displacement maps. Both are difficult/expensive to antialias, and thus almost all games just ignore the problem right now, resulting in flickering specular highlights and aliased edges on displacement-mapped surfaces. Neither of these has anything to do with deferred rendering (which is still not that common, although its use is increasing for many good reasons), but they cause a lot of the aliasing that I believe you guys are referring to.
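To put a rough number on the specular flicker problem, here's a tiny standalone sketch (toy numbers of my own, nothing from any particular game): with a tight highlight exponent, normals only a few degrees apart produce wildly different specular intensities, and one normal-map sample per pixel simply can't capture that variation.

[code]
// Toy sketch (made-up numbers, not from any of the games being discussed):
// with a tight specular exponent, normals only a few degrees apart give
// wildly different pow(N.H, exponent) results, so sampling a bumpy normal
// map once per pixel makes the highlight sparkle as the camera moves.
#include <cmath>
#include <cstdio>

int main() {
    const float exponent = 256.0f;                  // a fairly tight highlight
    const float angles[] = { 0.00f, 0.05f, 0.10f }; // N.H angles in radians
    for (float a : angles) {
        float nDotH = std::cos(a);
        std::printf("angle %.2f rad -> specular %.3f\n",
                    a, std::pow(nDotH, exponent));  // 1.000, ~0.726, ~0.278
    }
    return 0;
}
[/code]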

Now of course you always want "maximum IQ and speed", but it's way more complicated than that in practice. The shader I gave you above is gonna give you both of those, so enjoy playing your game with a completely black screen ;) Clearly we're all striving for maximum IQ and speed, but it's a direct trade-off... if you lower the settings of your game to the point that it's running at 400fps then sure, it's easy enough to deliver 4x SSAA, but let's keep things in perspective - even 4x SSAA'd Half-Life 1 with all the fancy control panel settings you can throw at it still doesn't look very good ;)

Agreed 100%; deferred rendering/shading, whatever the hell, shouldn't have been used until they had it working perfectly with any AA mode one could possibly want.
It does work just fine with any AA "mode" - the math is perfectly well-defined. People just haven't bothered to implement it since it's slightly more complicated and potentially more performance-intensive than with forward rendering. It'll happen though... again I reference Killzone 2 as an example of DR + AA that works just fine.
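For the curious, here's a rough sketch of what "well-defined" means in practice (my own pseudo-C++, with made-up names; not Killzone's actual code): keep the G-buffer per MSAA sample, run the lighting on every sample, and average the *lit* results rather than letting the hardware box-resolve non-color data.

[code]
// Rough sketch in plain C++ (structures and names are invented for
// illustration; this is not any shipping engine's code). The point: light
// each coverage sample, then average the lit colors. Resolving the
// G-buffer first and lighting once per pixel gives the wrong answer.
#include <algorithm>

struct Vec3  { float x, y, z; };
struct Color { float r, g, b; };
struct GBufferSample { Vec3 normal; Vec3 albedo; }; // stored per MSAA sample

// Stand-in lighting: one directional light, N.L only.
Color shade(const GBufferSample& s) {
    const Vec3 L = { 0.0f, 1.0f, 0.0f };
    float nDotL = std::max(0.0f,
        s.normal.x * L.x + s.normal.y * L.y + s.normal.z * L.z);
    return { s.albedo.x * nDotL, s.albedo.y * nDotL, s.albedo.z * nDotL };
}

// Per-sample shading followed by a box resolve of the lit samples.
Color resolvePixel(const GBufferSample* samples, int sampleCount) {
    Color sum = { 0.0f, 0.0f, 0.0f };
    for (int i = 0; i < sampleCount; ++i) {
        Color c = shade(samples[i]);
        sum.r += c.r; sum.g += c.g; sum.b += c.b;
    }
    sum.r /= sampleCount; sum.g /= sampleCount; sum.b /= sampleCount;
    return sum;
}
[/code]

That per-sample shading is exactly where the extra cost comes from, which is why it's a developer decision rather than something a driver can just flip on.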

And btw, "any AA mode that one could possibly want" is a bit too vague... I can define any number of "AA modes" that don't "work perfectly" with any application or renderer that you want to give as an example, including the precious 3dfx cards. In fact, as far as AA options are concerned, those cards are positively ancient compared to modern-day offerings, but that should be obvious.

Oh, but you guys in the know have it under control... NOT! The "suits" have you guys under control, but we all fight the good fight.
Lol wow. Particularly funny since you're talking to a guy who has spent a ridiculous amount of time "fighting the good fight" to spend more cycles on AA, especially for proper shadow AA/filtering :D
 
Why AA a POS at current res?
Textures are in a far worse condition than AA is at the moment.
 
It does work just fine with any AA "mode" - the math is perfectly well-defined. People just haven't bothered to implement it since it's slightly more complicated and potentially more performance-intensive than with forward rendering. It'll happen though... again I reference Killzone 2 as an example of DR + AA that works just fine.
Well, then why doesn't nvidia just sell drivers that have been hacked to work perfectly with aa? I know it's complicated like you say, but if people paid them to do that, then they could generate some extra revenue.

And while it may be particularly performance-intensive, it should be up to every end-user to decide whether or not performance is good enough.

I honestly don't get why nvidia/ati has to decide for me or anyone else whether a certain frame rate is good enough. The end-user should make the decision, not nvidia/ati, especially when people are willing to pay them extra for "hard work and responsibility."
 
Well, then why doesn't nvidia just sell drivers that have been hacked to work perfectly with aa? I know it's complicated like you say, but if people paid them to do that, then they could generate some extra revenue.
The game developer has to do it in this case. It's them you should be paying :)

And while it may be particularly performance-intensive, it should be up to every end-user to decide whether or not performance is good enough.
Agreed, although there's clearly an extreme case to that logic. Crysis, for instance, has already been criticized to death because no one can run it with max settings, even though their argument has always been that those settings are forward-looking. Similarly, if a game ran at 1fps on top-end hardware with some option enabled, do you really want the devs to spend extra time on that option just so that 10 years from now you can turn it on? :)

Again though, this is up to the games now, not NVIDIA/ATI. That's all I've been trying to say all along - the burden of implementing rendering has shifted from the HW/drivers to the games themselves, and it's there that you should be comparing/petitioning/voting with your $ for the features that you care about.
 
I'll leave it at this:

Microsoft should've started requiring 100% compatible HW AA that AA'd everything (all textures, lighting, and geometry) starting with DX9. They could've worked it out even if they had to make other changes in the spec to accommodate AA for every format and application.

Like I said, Microsoft either regulates something it shouldn't, or doesn't regulate something it should have.
 
If it didn't happen, it's a pretty good guess that no one actually making GPUs wants it.

A normal game has a sales span of a few months before ending up in the bargain bin. Only the well-endowed developers would spend resources on supporting a game that sells for 5 bucks. (This doesn't apply to online games.)

The developers try to get the maximum out of the available hardware, which most of the time means mid- to high-range consumer cards and some "to be released" cards. How far can, and should, they look into the future while developing their product? The development timespan is already multiple years; how are they going to incorporate graphics advances when there's no money to be made a couple of months down the road?

In other words, who is going to pay them for delivering features which will not bring in any revenue?
 
If it didn't happen, it's a pretty good guess that no one actually making GPUs wants it.
I kind of don't care what the makers of the GPU want, b/c I don't get why they should dictate what every player has to put up with.

Like I said, there's no reason, when MS designed dx9, for not working in 100% compatible, problem-free, great-looking HW AA. None.
 
I kind of don't care what the makers of the GPU want, b/c I don't get why they should dictate what every player has to put up with.

Like I said, there's no reason, when MS designed dx9, for not working in 100% compatible, problem-free, great-looking HW AA. None.

Sure there is -- feasibility.

There's no reason why the Department of Transportation can't come out and make it mandatory that all passenger vehicles MUST obtain 60 mpg in city driving conditions. None.

That doesn't mean it would be reasonable or attainable. Maybe you should join the rest of us out here in what I like to call "the real world". After you've made that journey, you might understand why your statement has about as much merit as my department of transportation anecdote.
 
I kind of don't care what the makers of the GPU want, b/c I don't get why they should dictate what every player has to put up with.

Like I said, there's no reason, when MS designed dx9, for not working in 100% compatible, problem-free, great-looking HW AA. None.

You really are just trying to ignore every technical argument anyone throws at you, right?
 
Probably. But maybe they could've worked on HW AA first at the sacrifice of something else?

You keep saying "hardware AA" as if this is some sort of hardware problem. There is no hardware problem, it's a software problem. The developers have chosen not to implement antialiasing, and it is their choice to make. The hardware is entirely capable of running antialiased shaders if the developer so chose to write one, but they didn't.

End of discussion.
 
You keep saying "hardware AA" as if this is some sort of hardware problem. There is no hardware problem, it's a software problem.
Yes, PRECISELY. I've been trying to explain the reasons why developers tend not to implement AA in more complicated situations (i.e. it's harder to do and a lot slower), but fundamentally Microsoft can't mandate that "everything works with HW AA". It's maybe hard to explain without getting too much into the math/logic, but that mandate doesn't make any sense.

And once you start trying to think about what else to mandate, it gets even sillier... the only reasonable thing one could enforce in the end is something super-sampling based, and even then it's easy enough to write a function with infinite frequencies at pixel borders, rendering the super-sampling completely useless.
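As a toy illustration of that last point (mine, not anyone's real shader): take a term like sin(1/x) near zero. Its frequency is unbounded, so 4, 64, or 1024 supersamples per pixel still don't converge on anything stable.

[code]
// Toy example (not from any real shader): a term whose frequency content
// grows without bound near x = 0. "Supersampling" one pixel that covers
// (0, 0.001] keeps giving noticeably different answers as the sample count
// goes up, because no fixed sample rate can capture the signal everywhere.
#include <cmath>
#include <cstdio>
#include <initializer_list>

float shaderTerm(float x) {
    return 0.5f + 0.5f * std::sin(1.0f / x);   // unbounded frequency near zero
}

int main() {
    for (int samples : { 4, 64, 1024 }) {
        double sum = 0.0;
        for (int i = 0; i < samples; ++i)
            sum += shaderTerm(0.001f * (i + 0.5f) / samples);
        std::printf("%4d samples -> average %f\n", samples, sum / samples);
    }
    return 0;
}
[/code]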

We're into the space where image quality is more qualitative, so while I agree that a lot more time and resources should be spent on general AA (NOT JUST whatever you're calling "hardware AA") and on filtering in general, it's 100% a per-game, software problem now.

You just have to deal with the fact that there is no "big hammer" that anyone can apply... you'll just have to buy or not buy games on a case by case basis. Like I said, you still have the consumer choice, you just have it at a finer granularity now.

Like I said, there's no reason, when MS designed dx9, for not working in 100% compatible, problem-free, great-looking HW AA. None.
And that's where you just overstep your bounds again. You've demonstrated that you don't have the technical background to back up that assertion (and even the assertion itself is rather vague... I could easily argue that they did work 100% compatible, problem-free AA into the API), so just don't make it, particularly when you've also chosen to ignore the technical information presented to you in this thread.

And get over DX9 already. DX10 >> DX9 :)
 
You just have to deal with the fact that there is no "big hammer" that anyone can apply... you'll just have to buy or not buy games on a case by case basis. Like I said, you still have the consumer choice, you just have it at a finer granularity now.
Actually that IS the big hammer anyone can apply to the problem. ;)

Vote with your wallet.
 
In older games there were no off-screen render targets storing much more than the final color data, no per-pixel displacement-mapping pixel shaders generating geometry on a per-pixel (not sub-pixel) basis, and no shaders doing animation and physics instead of graphics rendering, so simple unified hardware antialiasing (MSAA) was a much easier thing to achieve. Current and future games render (or should I say generate) the final screen pixels in so many possible ways that it's basically impossible to implement any kind of antialiasing that just works without any developer interaction.

In the future, hardware MSAA will do less and less, as more and more of the aliasing is not on polygon edges. All pixel-shader displacement-mapping techniques cause aliasing inside the polygon surfaces, and hardware MSAA affects only polygon edges. There is no simple way to antialias pixel shader effects like this. SSAA works in most cases, but 4x SSAA costs around 75% of your performance (even 4x SLI would not be enough to match the performance without AA) and is not a magic bullet either. The driver cannot just upscale the buffers without developer control, as many post-process effects that depend on exact texel sampling positions break down (a half-texel-shifted 4x4 bilinear fetch is very commonly used in blur filters, for example).
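To illustrate with a made-up example of the kind of filter that breaks (my own sketch, not any particular engine's code): a 4x4 box downsample done with four bilinear fetches whose offsets assume a specific source resolution.

[code]
// Made-up illustration: a 4x4 box downsample done with four bilinear
// fetches, each placed at a texel corner so the hardware averages a 2x2
// block per fetch. The offsets are derived from the resolution the
// developer *thinks* the buffer has; if the driver silently renders it at
// a different (supersampled) size, the fetches no longer land between the
// right texels and the filter falls apart.
struct Float2 { float x, y; };

void downsampleOffsets(int expectedWidth, int expectedHeight, Float2 out[4]) {
    const float tx = 1.0f / expectedWidth;   // assumed texel size in UV space
    const float ty = 1.0f / expectedHeight;
    out[0] = { -tx, -ty };  // each offset puts a bilinear fetch at the shared
    out[1] = { +tx, -ty };  // corner of one 2x2 texel block; together the
    out[2] = { -tx, +ty };  // four fetches average a 4x4 neighbourhood in
    out[3] = { +tx, +ty };  // a single pass
}
[/code]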

Many new games do not support antialiasing because good-quality AA would be too slow and memory-intensive to implement compared to the image quality improvements that could be made with the same hardware resources put to other uses. These limitations mostly affect developers that have chosen rendering techniques which give them better-looking dynamic lighting, shadowing, and post-process effects. Antialiasing is easier (and much more performance-efficient) to implement for games that use older rendering techniques compatible with hardware MSAA.
 
After reading through this thread (and especially other, less technical, forums), it's interesting to see how many misconceptions still exist regarding why AA won't work with this game or that. I think I have a pretty good handle on AA in general, but since I haven't messed with multisampled surfaces at all in D3D, there are a few things I'd like to clear up.

It's my understanding that the main reasons MSAA doesn't work in various games are:
a) Deferred rendering - AA resolve doesn't produce correct results for non-color data.
b) Multiple render targets (MRT) - MRTs don't support multisampling (MS) in DX9(?).
c) Render-to-texture - Texture surfaces cannot be MS'd (but that doesn't prevent you from rendering to a MS'd surface and then copying into a texture, right? See the sketch just after this list).
d) FP16 render target - Hardware limitation on pre-R520/G80 GPUs.
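For (c), here's roughly the path I mean, sketched with plain D3D9 calls (error handling and the matching multisampled depth buffer omitted; purely illustrative, not production code):

[code]
// Render into a multisampled surface, then StretchRect into a plain
// texture; the StretchRect from an MSAA surface performs the resolve.
#include <d3d9.h>

void renderSceneToTexture(IDirect3DDevice9* dev, UINT w, UINT h,
                          IDirect3DTexture9** outTex)
{
    // 1. A multisampled, non-texture render target surface.
    IDirect3DSurface9* msaaRT = NULL;
    dev->CreateRenderTarget(w, h, D3DFMT_A8R8G8B8,
                            D3DMULTISAMPLE_4_SAMPLES, 0, FALSE,
                            &msaaRT, NULL);

    // 2. A plain (non-multisampled) texture to resolve into.
    dev->CreateTexture(w, h, 1, D3DUSAGE_RENDERTARGET, D3DFMT_A8R8G8B8,
                       D3DPOOL_DEFAULT, outTex, NULL);
    IDirect3DSurface9* resolveTarget = NULL;
    (*outTex)->GetSurfaceLevel(0, &resolveTarget);

    // 3. Render the scene into the MSAA surface.
    dev->SetRenderTarget(0, msaaRT);
    // ... draw calls go here ...

    // 4. Resolve: StretchRect from the MSAA surface into the texture.
    dev->StretchRect(msaaRT, NULL, resolveTarget, NULL, D3DTEXF_NONE);

    resolveTarget->Release();
    msaaRT->Release();
}
[/code]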

It's also my understanding that at least a and b (and obviously d) are no longer limitations in DX10 (though MS'd deferred rendering requires developer support). There are probably a few cases that I missed as well, such as whatever prevents UE3 from properly supporting MSAA despite the fact that only shadow (map?) rendering is deferred. This is compounded by the fact that, despite reports that only Nvidia cards support AA in UT3, if I force AA in the CP for my X1900 I definitely see AA on *most* edges, albeit with a major performance hit. As you can see by all the question marks, there are still plenty of things I'm unsure of.

So I understand why most games that don't support MSAA don't support it. However, I was recently reminded that Halo, one of the earlier DX9 titles, doesn't support MSAA either, and I'm not exactly sure why. In Bungie's technical FAQ, they mention something about using an off-screen render target (RT) for the main shading pass so they can later apply some post-processing effects for the final render. My question is whether there is a technical reason they *can't* support MSAA (MRT maybe?), or if it's just that they don't code for MSAA for their off-screen RT and forcing it in the driver won't work because that only affects the backbuffer. Even that I'm not sure of; when you force AA in the driver, does it try to MS all supported surfaces created with an RT flag, or just the backbuffer?
 
From the PC technical FAQ, it appears they are using MRTs.

Oh yes, I can't believe I missed the part in the FAQ where it says it renders to "multiple off-screen surfaces". I must have blocked that part out and just focused on the fact that they don't render directly into the backbuffer. :oops:
 