Am I one of the few people who are unimpressed by antialiasing?
Given a choice between running a game at 1600*1200, 1280*960 with 2x FSAA, or 1024*768 with 4x FSAA, I'd almost always choose 1600*1200. (The exception is Homeworld/Cataclysm, which looks much better with AA.)
The problem, IMHO, is that in most games I can't even tell the difference between aliased and antialiased unless I'm looking at a still screenshot; when things are moving, jaggies are much harder to see, and at resolutions of 1280*960 and higher they're too small to notice at all. Moreover, 2x antialiasing doesn't do much, Quincunx is blurry, and 4x is slow. The increased detail of a higher resolution, on the other hand, is obvious whether it's a screenie or in-game.
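To put rough numbers on it, here's a back-of-the-envelope sketch in Python, using the three settings above and assuming brute-force supersampling where every AA sample gets shaded and stored (which overstates MSAA's cost, but shows the scale of the trade-off):

    # Compare displayed pixels vs. total rendered samples for each setting.
    # Assumption: N-x FSAA costs roughly N samples per pixel (true for
    # supersampling; multisampling is cheaper on shading but not on memory).
    settings = [
        ("1600x1200, no AA",  1600, 1200, 1),
        ("1280x960, 2x FSAA", 1280,  960, 2),
        ("1024x768, 4x FSAA", 1024,  768, 4),
    ]
    for name, w, h, samples in settings:
        pixels = w * h
        print(f"{name:20s} {pixels:>9,} pixels on screen, {pixels * samples:>9,} samples rendered")

The AA modes end up pushing as many or more samples through the card as plain 1600*1200 does, while putting fewer distinct pixels on the screen, which is exactly why I'd rather spend the fill rate on resolution.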
When jaggies are really only a problem in a small subset of games (driving and flight sims, Homeworld), why do video card makers spend so much effort on improved antialiasing schemes? Perhaps antialiasing will be worthwhile on the R300/NV30 generation of hardware, simply because they'll have plenty of framerate to burn, but on a GF4 Ti4400 OC'ed to Ti4600 speeds, antialiasing is a rather severe hit to framerate in newer games. Sure, you can turn on 4xS for Half-Life and Quake 3, but by the time you get to Max Payne or America's Army, turning on antialiasing means an ugly choice between an unplayable framerate and lowering the resolution. And I doubt even the R300/NV30 will be able to run Unreal 2 or DOOM 3 at maxed-out settings with FSAA while maintaining 60+ fps.
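For a sense of scale on that framerate hit, a crude worst-case estimate (the baseline number is hypothetical, and it assumes the game is completely fill/bandwidth limited so frame time scales with samples per pixel; real MSAA lands somewhere below this):

    def fps_with_aa(baseline_fps, samples_per_pixel):
        # Worst case: perfectly fill-limited, so frame time grows linearly
        # with the number of samples rendered for each pixel.
        return baseline_fps / samples_per_pixel

    # Hypothetical baseline: 90 fps with no AA.
    for s in (1, 2, 4):
        print(f"{s}x samples -> ~{fps_with_aa(90, s):.0f} fps")

Even if multisampling dodges most of the shading cost, the extra framebuffer bandwidth is why 4x modes so often drop a game from comfortably above 60 fps to dipping below it.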
Most of the time, it's texture aliasing that annoys me. 4x MSAA in HW:C makes ship edges look wonderful, but since multisampling only touches polygon edges, it does nothing to get rid of the incredibly annoying crawly textures on the mothership. Why do some games set their LOD bias so aggressively? Grrr...
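The reason MSAA can't help is that it only takes extra samples for edge coverage; textures are still sampled once per pixel, and a negative LOD bias forces a sharper mip level than the pixel footprint justifies. A quick sketch of GL-style mip selection (the footprint number is made up for illustration):

    import math

    def mip_level(texels_per_pixel, lod_bias, num_levels):
        # GL-style LOD: lambda = log2(minification footprint) + bias,
        # clamped to the available mip levels.
        lam = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
        return min(max(lam, 0.0), num_levels - 1)

    # A distant surface minified 4x (each pixel covers ~4 texels per axis).
    for bias in (0.0, -1.0, -2.0):
        print(f"bias {bias:+.1f} -> mip {mip_level(4.0, bias, 10):.1f}")

At a bias of -2 the sampler is reading the full-resolution mip even though the surface is minified 4x, so several texels fight over each pixel from frame to frame; that's the crawling, and no amount of edge antialiasing touches it.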