AA

Why don't games using the GRAW engine allow AA?

Could NVIDIA make a driver to allow any AA mode to be forced from the control panel?

Also, what HW/rendering technique/game engine limitation prevents control panel AA from working?
 
Deferred rendering is the culprit.
I'm sure someone will give you a more in-depth explanation.
PS: a later patch does allow edge smoothing in GRAW.
 
Yup, deferred rendering. It's one of many, many interesting and useful rendering styles that are incompatible with the driver trying to screw with your buffers and settings behind your back. You can tell that I'm not a fan of control-panel-forced stuff (except for super-old games that predate the API flexibility that lets something like forced AA screw things up).

IMHO control panel AA, AF, "adaptive AA", LOD biasing/clamping and arguably even gamma correction on AA resolve, trilinear/aniso optimization and a host of other things have no place in DX9/10+ applications and hardware. I cannot count the number of times these things have screwed over perfectly legitimate algorithms and caused users no end of pain. Seriously, I'm grateful that Microsoft has tightened up the spec on stuff like this in DX10 and beyond.

Sometimes they're clever enough to turn themselves off when they detect an incompatible style of rendering, but that's not always possible. Still, if you force stuff from the control panel, you're completely on your own... consider yourself lucky if your game runs, let alone renders properly ;)
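
To make that concrete, here's a minimal sketch in plain C++ (no real API calls; the struct and names are made up for illustration) of the contract a deferred renderer relies on, and why a driver forcing MSAA onto its buffers breaks it:

```cpp
#include <cstdint>

// Hypothetical G-buffer texel -- everything the lighting pass needs, per pixel.
struct GBufferTexel {
    float    normal[3];   // world-space surface normal
    float    depth;       // view-space depth
    uint32_t materialId;  // index into a material table -- discrete data!
};

// Pass 1 (geometry): write one GBufferTexel per screen pixel via MRTs.
// Pass 2 (lighting): read that texel back and compute shading from it.
float ShadePixel(const GBufferTexel& g) {
    // This code assumes g describes exactly one surface point. If the driver
    // silently made the G-buffer multisampled and resolved (averaged) it,
    // 'normal' becomes a blend of two different surfaces, 'depth' a point
    // floating between them, and 'materialId' a meaningless average of two
    // table indices. The driver can't know that -- to it, these are just
    // more render targets.
    return g.normal[1] * (1.0f - g.depth);  // stand-in for real lighting math
}
```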
 
I have the opposite view. Microsoft's specs are always below NVIDIA's. ATI generally doesn't add anything to MS's specs, and that's why I prefer NVIDIA. A few big examples: ATI doesn't do full trilinear, and the X1K series didn't have HW FP16 filtering and couldn't do vertex texturing.

Several examples: I believe DX10 does away with TrSS (transparency supersampling). It also only requires 4x MSAA, and doesn't require angle-invariant AF or unoptimized trilinear.

I liked it much better when hardware was unique and the specs weren't dictated by MS.

Both GPU makers should make their drivers and hardware so they don't conflict with any games. Microsoft is the one who screwed over backward compatibility. On the other hand, play any OpenGL game in 32-bit color and it improves with each new generation of HW; DX5/6/7 games are busted on later HW.

Think about how 3dfx was so much better than the competition. It was because they weren't limited by Microsoft's stupid regulations.
 
Microsoft's specs are always below NVIDIA's.
Not true - quite the opposite in fact. Note that NVIDIA currently has no DX10.1-supporting cards while ATI does. Now the main point here is that everyone conspires on the new DX specs, but having Microsoft involved helps to push things along and at the same time drive compatible features, which in this day and age is an absolute necessity. No one is going to ship a game that only works on one vendor's graphics cards...

Several examples: I believe DX10 does away with TrSS (transparency supersampling).
If you're talking about what I think you are, this was never officially supported in any API that I know of, and indeed is one of those silly backend hacks that works in a few cases and falls flat in a lot of other legitimate ones. Great for DX5/6/7/8 games, but silly beyond that.

It also only requires 4x MSAA, and doesn't require angle-invariant AF or unoptimized trilinear.
Indeed, but it doesn't prevent you from doing more in either of those cases. Even NVIDIA's CSAA modes are supported just fine through the DX interfaces.
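
For the curious, here's a hedged D3D10 sketch (it assumes an already-created ID3D10Device* and uses a hypothetical helper name) showing how that works: vendor modes like CSAA just show up as extra quality levels when you enumerate multisample support through the standard interface.

```cpp
#include <cstdio>
#include <d3d10.h>

// Enumerate multisample support through the standard D3D10 interface.
// CSAA is never named in the API; NVIDIA exposes its CSAA modes as extra
// quality levels on top of the plain MSAA ones.
void ListMsaaModes(ID3D10Device* device) {
    for (UINT samples = 1; samples <= D3D10_MAX_MULTISAMPLE_SAMPLE_COUNT; ++samples) {
        UINT qualityLevels = 0;
        if (SUCCEEDED(device->CheckMultisampleQualityLevels(
                DXGI_FORMAT_R8G8B8A8_UNORM, samples, &qualityLevels))
            && qualityLevels > 0) {
            // A render target created with SampleDesc = { samples, q } for any
            // q < qualityLevels uses that mode -- no control panel involved.
            std::printf("%ux MSAA: %u quality level(s)\n", samples, qualityLevels);
        }
    }
}
```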

I liked it much better when hardware was unique and the specs weren't dictated by MS.
As I mentioned above, those days are rightfully gone, as it is unreasonable to develop software for a specific GPU. Conversely, rendering has gotten to a level of programmability and generality where drivers can no longer safely infer how rendering is being done, as most of the stuff that generates an image now is application code, not fixed-function logic.

Both GPU makers should make their drivers and hardware so they don't conflict with any games.
A huge part of that is *not screwing with what the game is trying to do behind its back*! Otherwise the next game patch or driver update could very well break things again.

Microsoft is the one who screwed over backward compatibility. On the other hand, play any OpenGL game in 32-bit color and it improves with each new generation of HW; DX5/6/7 games are busted on later HW.
I agree that DX5/6/7 were poor 3D APIs compared to OpenGL, but OpenGL has since become so incredibly outdated and irrelevant that there's simply no argument anymore. (Also note that I specifically stated talking about DX9/10+ rendering engines.) If OpenGL was updated to what is required/expected by modern rendering engines, it would have exactly the same "issues" as DX here: drivers can no longer expect to pull nearly as much crap behind the application's back because most of the rendering is now handled by user code. Thus typically you're restricted to performance optimizations, because - and let me stress - *there is no reasonable and robust way to impose things like AA on a modern rendering engine*, which may well be doing *anything* with the data in its buffers/render targets. The days of "I fire off a bunch of triangles and textures to the GPU and it does all the rendering and gives me back a framebuffer" are long gone, thank goodness.

Think about how 3dfx was so much better than the competition. It was because they weren't limited by Microsoft's stupid regulations.
Ironically, you're arguing against yourself here ;)
 
If you're talking about what I think you are, this was never officially supported in any API that I know of, and indeed is one of those silly backend hacks that works in a few cases and falls flat in a lot of other legitimate ones. Great for DX5/6/7/8 games, but silly beyond that.

I know it was never supported by any API, but games shouldn't have severe alpha texture aliasing.

The X360 only does 4x RGMS, and things like grass shimmer like crazy.
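
On the alpha-test shimmer specifically, DX10 does hand the application a standard tool for it. A rough sketch, assuming a D3D10 device (the function name is made up):

```cpp
#include <d3d10.h>

// Alpha-to-coverage: the pixel shader's alpha output becomes an MSAA
// coverage mask, so alpha-tested foliage and fences get smoothed by the
// same MSAA the rest of the scene uses -- entirely under app control.
ID3D10BlendState* CreateFoliageBlendState(ID3D10Device* device) {
    D3D10_BLEND_DESC desc = {};
    desc.AlphaToCoverageEnable = TRUE;      // the one flag that matters here
    desc.SrcBlend       = D3D10_BLEND_ONE;  // blending itself stays off;
    desc.DestBlend      = D3D10_BLEND_ZERO; // these are just valid defaults
    desc.BlendOp        = D3D10_BLEND_OP_ADD;
    desc.SrcBlendAlpha  = D3D10_BLEND_ONE;
    desc.DestBlendAlpha = D3D10_BLEND_ZERO;
    desc.BlendOpAlpha   = D3D10_BLEND_OP_ADD;
    desc.RenderTargetWriteMask[0] = D3D10_COLOR_WRITE_ENABLE_ALL;

    ID3D10BlendState* state = nullptr;
    device->CreateBlendState(&desc, &state); // check the HRESULT in real code
    return state;
}
```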


As I mentioned above, those days are rightfully gone, as it is unreasonable to develop software for a specific GPU. Conversely, rendering has gotten to a level of programmability and generality where drivers can no longer safely infer how rendering is being done, as most of the stuff that generates an image now is application code, not fixed-function logic.
Fixed function has advantages, though, because it requires less programming. It's just turned on or off. I wish shaders were only used for special effects. Look at how lousy AA looks in games that only do it through shaders. With fixed function, the game doesn't suffer if the programmers are lazy. With everything being done by shaders, lazy programming, which is very common these days, ruins games.
 
Because he could.
 
Two more questions:

1. If Bionic Commando has a DX10 mode, would HW MSAA and TrSS be possible?

2. Is The Matrix: Path of Neo supposed to work with AA? I only tried it with one driver (I forgot which) a while back and AA wasn't applied, so I was wondering whether it's just a driver issue or the game simply doesn't allow AA.
 
Fixed function has advantages, though, because it requires less programming. It's just turned on or off. I wish shaders were only used for special effects. Look at how lousy AA looks in games that only do it through shaders. With fixed function, the game doesn't suffer if the programmers are lazy. With everything being done by shaders, lazy programming, which is very common these days, ruins games.
Good luck trying to do any half-decent graphics (by today's standards) using only fixed-function.

Go boot up your favourite DX7 game. Do you notice how it doesn't look anywhere near as good as Crysis? Or, heck, run Half-Life 2 in DX7 mode (which is purely fixed-function). It'll look much worse than DX9 mode.

I'd also like to point out that fixed function on modern hardware is emulated with shaders by the driver. Fixed-function hardware doesn't even exist anymore. So there's nothing inherent to shaders that causes your so-called "lousy AA".
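
To illustrate, here's a minimal D3D9 sketch (assuming an existing IDirect3DDevice9*) of a classic fixed-function setup that modern drivers simply turn into a shader:

```cpp
#include <d3d9.h>

// Classic fixed-function texturing: stage 0 multiplies the texture by the
// interpolated vertex color. No GPU has dedicated silicon for this anymore;
// the driver compiles the state into a tiny pixel shader, roughly:
//     return tex2D(s0, uv) * diffuse;
void SetupFixedFunctionModulate(IDirect3DDevice9* device) {
    device->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_MODULATE);
    device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    device->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);
}
```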
 
I'd also like to point out that fixed function on modern hardware is emulated with shaders by the driver. Fixed-function hardware doesn't even exist anymore. So there's nothing inherent to shaders that causes your so-called "lousy AA".
Yes indeed. The point here is that if a modern game has lousy AA, by all means fault the game (I do)! All of the "tools" of the fixed-function days are still available for the application to use, so it's not like they can't get as good "quality" as DX7 apps (in fact, it's quite trivial). What *has* changed is now it's more of the application building a rendered frame using the given tools rather than basically the GPU building a rendered frame with a few sliders for the application to play with. It's thus no longer appropriate (or safe) for the GPU to try and impose a hammer on the application when it may not even be using nails... or wood... or even building something.
 
Question: Why don't you use the forum's search?

Why don't games using the GRAW engine allow AA?
http://forum.beyond3d.com/showthread.php?t=36790
Could NVIDIA make a driver to allow any AA mode to be forced from the control panel?
They did it for STALKER (DX9).

Also, what HW/rendering technique/game engine limitation prevents control panel AA from working?
It's the game developers, who use strange technologies like Deferred Rendering, and Microsoft/DirectX 9, which doesn't allow MSAA on MRTs or reading back from the Z-buffer.
The funny thing is: Rainbow Six Vegas 2 uses UE3 and there are no problems using MSAA without IHVs' hacks.


See also: What does DX10.1 add vs. DX10 that makes such a big difference? (deferred shading)
 
Dood... it's not a fixed function. It requires some programming to get it to do what you want.

ERROR: Can not find 'Forum Search' in "FIXED FUNCTION".

In all seriousness, you're best off googling why. Google is this great new service which allows you to search the internet; it's kinda like a library for the internet, without the fees.
 
Very poor showing there gais. Ceilingcat has lost its faith in you.
 
The problem with forced AA or any other forced setting is that it changes the specified hardware behaviour. New games do much more than render visible pixels using the GPU. Only color buffers can be antialiased properly (and not even all of the color buffers). Buffers containing, for example, normal vectors, object IDs, material IDs, indexes, memory addresses or any other discrete data cannot be properly antialiased. Nasty things happen if, for example, the driver forces your physics simulation render target buffer to be antialiased.

As the hardware or driver does not know what you are doing with the buffer, it cannot change the rendering behaviour.
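
Here's a minimal sketch of the resolve problem in plain C++, with made-up sample values: averaging is fine for color, but applied to discrete data like object IDs it manufactures a value that never existed.

```cpp
#include <cstdio>
#include <cstdint>

int main() {
    // Four MSAA samples covering one pixel on an object edge.
    float    colorSamples[4] = { 0.9f, 0.9f, 0.1f, 0.1f };  // light vs. dark surface
    uint32_t idSamples[4]    = { 7, 7, 12, 12 };            // object 7 vs. object 12

    // A standard resolve averages the samples.
    float resolvedColor = 0.0f;
    float resolvedId    = 0.0f;
    for (int i = 0; i < 4; ++i) {
        resolvedColor += colorSamples[i] * 0.25f;
        resolvedId    += idSamples[i]    * 0.25f;
    }

    std::printf("color = %.2f\n", resolvedColor);  // 0.50 -- a sensible edge blend
    std::printf("id    = %.1f\n", resolvedId);     // 9.5  -- an object that doesn't exist
    return 0;
}
```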
 