Aliasing

Diplo

Veteran
This might seem like a stupid question, but I'll ask it anyway:

Do different 3D engines running at the same resolution and same settings (texture resolution, AF etc) produce differing levels of aliasing? In other words, can some 3D engines, running at a given resolution, produce more or less aliasing than another engine? (I'm obviously assuming no forced AA).

I ask because I often see on many forums people saying, "Wow! The jaggies are bad on that game" or "shame about the jaggies that engine produces". Now I'd always assumed that if screen resolution and texture detail were the same then, all things equal, the amount of visible aliasing would be the same regardless of the way an engine renders a scene. So, to my mind, blaming the game or engine itself for jaggies seemed wrong.

Having said that, I did notice that 'Doom 3' seemed to produce less visible aliasing at 800x600 than 'Half Life 2' running at 1152x864. I presumed this was simply due to D3 being much darker and having lower resolution textures, whereas the sharp textures and brightness of HL2 made the 'jaggies' more evident. Is this correct? Or is there more to it than that?
 
I don't think it has anything to do with texture quality or the engine itself. It's just the number of high contrast edges in the scene IMO. HL2 has a lot more of these than Doom3.
 
Though theoretically the narrowing of engine choices ought to make it easier to address the aliasing problem, right?
 
I would have to go with Doom 3's darkness as being the main factor, with its very "busy" environment as a secondary cause. Low contrast, darker pixels are not going to show much aliasing, and aliased edges that are well lit are going to blend in better if the pixels on either side are more varied in color.
 
You can lower aliasing visibility if you choose textures with similar colours for all objects (e.g. you can't see gray jaggies on a gray background).

So I agree with trinibwoy and Crusher.
 
Count the spots where HL2 has alpha tests in each scene and the answer is easy.
 
CosmoKramer said:
I'm surprised no one has mentioned the LOD bias.
Be my guest :)

Seriously, I'd like to hear if there are any more technical reasons (apart from high contrast) that influence whether aliasing seems more apparent or not. Is it something developers can consciously help avoid by using certain techniques?
 
Well, theoretically you could make a completely jaggie-free game.
Use only straight horizontal and vertical lines, and don't rotate them no matter what, even when the player moves around.
 
Diplo said:
This might seem like a stupid question, but I'll ask it anyway:

Do different 3D engines running at the same resolution and same settings (texture resolution, AF etc) produce differing levels of aliasing? In other words, can some 3D engines, running at a given resolution, produce more or less aliasing than another engine? (I'm obviously assuming no forced AA).
Sure do, but not for everything. There are a great number of things that affect aliasing. You've got edge aliasing. You've got surface aliasing (which I'll define as aliasing produced by a pixel shader or an alpha test). And you've got texture aliasing.

Edge aliasing is a function of level design and of the use of FSAA. This, then, is largely independent of the game engine, but highly dependent upon the artwork and upon the 3D hardware used.

Texture aliasing is a function of the contrast levels of the textures, and of the LOD bias selected by the program. This is marginally a function of the engine, and mostly a function of texture art. Unfortunately, different hardware also has different levels of aliasing with the same settings, so it's rather hard for game developers to minimize texture aliasing while maximizing texture quality.
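To make the LOD bias part concrete, here's a minimal sketch of the sort of knob involved (OpenGL 1.4+ per-texture bias; the value and the idea of wrapping it in a "texture sharpness" setting are just illustrative assumptions on my part):

Code:
    #include <GL/gl.h>

    // Sketch: biasing mip-level selection to trade sharpness against
    // texture aliasing. A positive bias picks blurrier mip levels (less
    // shimmer); a negative bias picks sharper ones (more shimmer).
    // Requires headers/drivers exposing OpenGL 1.4 or EXT_texture_lod_bias.
    void setTextureLodBias(GLuint texture, float lodBias)
    {
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_LOD_BIAS, lodBias);
    }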

Surface aliasing is where game engines can really differ, though. The best handling I've seen so far is in UT2004. That game uses a mixture of alpha blends and alpha tests to produce less aliasing than you'd get with a simple alpha test.
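I don't know exactly what UT2004 does internally, but the kind of state combination I mean looks roughly like this in old fixed-function OpenGL terms (the threshold and blend mode are my guesses, purely illustrative):

Code:
    #include <GL/gl.h>

    // Sketch: combine alpha test and alpha blend on a cut-out texture.
    // The low alpha-test threshold discards only the nearly invisible
    // texels; the remaining fringe is alpha-blended, which softens the
    // hard stair-step edge that a plain alpha test produces.
    void drawCutoutWithSoftEdges()
    {
        glEnable(GL_ALPHA_TEST);
        glAlphaFunc(GL_GREATER, 0.1f);          // illustrative threshold
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

        // ... draw the alpha-textured geometry here ...

        glDisable(GL_BLEND);
        glDisable(GL_ALPHA_TEST);
    }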

Other possibilities include optimizations that use normal texture filtering for operations where it wasn't originally intended. One example is bump mapping. If you filter a bump map, you'll get an average of the stored normal directions, and the length of the resulting normal vector will no longer be equal to one.

To see why this is, think of an extreme case: imagine two neighboring vectors in the texture being averaged together through filtering. These vectors are pointing nearly in opposite directions, so the filtered vector has a length close to zero, with a direction given by the very small net direction of the two vectors. Now, you can do studies of what these situations should look like as a function of the length of the resultant vector. This has in fact been done, and leads to an approximation of MIP mapping for bump maps.
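If you want to see the arithmetic, here's a tiny standalone example; the specular-power adjustment at the end is one published variant of this idea (Toksvig's), and the exact numbers are only illustrative:

Code:
    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    static Vec3 normalize(Vec3 v)
    {
        float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
        return { v.x / len, v.y / len, v.z / len };
    }

    int main()
    {
        // Two unit normals pointing in nearly opposite directions.
        Vec3 a = normalize({  0.99f, 0.0f, 0.14f });
        Vec3 b = normalize({ -0.99f, 0.0f, 0.14f });

        // Filtering (averaging) them, as texture filtering would:
        Vec3 avg = { (a.x + b.x) * 0.5f, (a.y + b.y) * 0.5f,
                     (a.z + b.z) * 0.5f };
        float len = std::sqrt(avg.x * avg.x + avg.y * avg.y + avg.z * avg.z);
        std::printf("filtered normal length = %.3f (no longer 1)\n", len);

        // The shortened length measures how much the normals disagreed,
        // and can be used to damp the specular exponent (Toksvig-style):
        float specPower = 32.0f;
        float ft = len / (len + specPower * (1.0f - len));
        std::printf("effective specular power: %.2f instead of %.0f\n",
                    ft * specPower, specPower);
        return 0;
    }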

Another option has to do again with alpha-tested textures. See Humus' recent demo on alpha coverage.
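For reference, on hardware that supports it the render state itself is trivial to toggle when you're drawing into a multisampled target (OpenGL naming shown; what the demo does beyond this is up to Humus):

Code:
    #include <GL/gl.h>

    // Sketch: alpha-to-coverage converts a fragment's alpha into a
    // multisample coverage mask, so the edge of an alpha-tested texture
    // gets partial coverage instead of a hard on/off cut. Only meaningful
    // when rendering to a multisampled buffer (GL 1.3 / ARB_multisample).
    void setAlphaToCoverage(bool enabled)
    {
        if (enabled)
            glEnable(GL_SAMPLE_ALPHA_TO_COVERAGE);
        else
            glDisable(GL_SAMPLE_ALPHA_TO_COVERAGE);
    }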

So, there are many ways that game engines can affect aliasing, but not all types.
 
Chalnoth said:
Edge aliasing is a function of level design and of the use of FSAA. This, then, is largely independent of the game engine, but highly dependent upon the artwork and upon the 3D hardware used.

Could you expand on this a bit. What exactly about the artwork impacts the level of aliasing? And the 3D hardware dependency is only when FSAA is enabled right?
 
trinibwoy said:
What exactly about the artwork impacts the level of aliasing?
I'm guessing it's to do with the contrast between colours and brightness - for example, bright white characters in the foreground against a pitch-black skybox would probably look more aliased than ones that blended in more. I remember noticing this a lot in HL2 when you had dark telephone wires against a bright azure sky.
 
Since edge aliasing happens on edges (duh!), a scene with more edges is likely to have more aliasing. For example, a sphere does not have much aliasing, but a spiky ball has a lot.
 
trinibwoy said:
Could you expand on this a bit. What exactly about the artwork impacts the level of aliasing? And the 3D hardware dependency is only when FSAA is enabled right?
Well, City of Heroes is an excellent example. If you've played it, you know that you can set your anti-aliasing as high as you want and you're likely to still notice aliasing.

More generally, there is typically lots of aliasing on edges that are both high-contrast and very regular. A good example would be a wide stairway viewed from a great distance. The stripes that make up this stairway will render any edge AA algorithm pretty much useless if viewed from a great enough distance: once the steps start approaching pixel-size you'll start getting very distracting curved aliasing patterns.
 
pcchen said:
Since edge aliasing happens on edges (duh!), a scene with more edges is likely to have more aliasing. For example, a sphere does not have much aliasing, but a spiky ball has a lot.
Well, the type of edges matters more than their number. The more regular the edges are and the higher their frequency (the closer to pixel size), the greater the problem for antialiasing algorithms.
 
Right now I'm playing Bloodrayne 2 at 1280x960, with 4x AA set through the ATi CCC, and the game looks jaggy as hell.

It sucks because I'm taking the performance hit that 4x AA would normally incur, but not getting the results I would expect.
 
This will happen if the game uses a render-to-texture for most rendering, and later does a pass where the texture is then read in and applied to the full screen. While this is a staple for HDR rendering, it is also used for bloom effects in general. If Bloodrayne 2 has any bloom effects, this is probably what's happening, and you'll only get AA if it's supported within the application.
 
Chalnoth said:
This will happen if the game uses a render-to-texture for most rendering, and later does a pass where the texture is then read in and applied to the full screen. While this is a staple for HDR rendering, it is also used for bloom effects in general. If Bloodrayne 2 has any bloom effects, this is probably what's happening, and you'll only get AA if it's supported within the application.

Well, it's possible to render everything to the framebuffer (with or without AA), StretchBlt the result to a texture, and apply the bloom at the end.
This happens to be the fastest solution - and it supports AA.

So why couldn't games do AA and bloom at the same time?
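Roughly what I mean, sketched with the OpenGL framebuffer-blit path (in D3D9 terms the resolve step would be a StretchRect); the FBO/texture handles and the drawScene / drawBloomPass calls are hypothetical placeholders, and glBlitFramebuffer needs GL 3.0 or EXT_framebuffer_blit:

Code:
    #include <GL/gl.h>

    // Sketch of "AA first, bloom afterwards": render into a multisampled
    // framebuffer, resolve it into an ordinary texture, then let the
    // bloom/post pass read that already-antialiased texture.
    void renderFrame(GLuint msaaFbo, GLuint resolveFbo, GLuint sceneTexture,
                     int width, int height)
    {
        // 1. Render the scene with multisampling.
        glBindFramebuffer(GL_FRAMEBUFFER, msaaFbo);
        // drawScene();                                   // hypothetical

        // 2. Resolve (downsample) the MSAA buffer into a plain texture.
        glBindFramebuffer(GL_READ_FRAMEBUFFER, msaaFbo);
        glBindFramebuffer(GL_DRAW_FRAMEBUFFER, resolveFbo); // sceneTexture attached
        glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                          GL_COLOR_BUFFER_BIT, GL_NEAREST);

        // 3. Bloom pass samples the resolved, antialiased texture.
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        glBindTexture(GL_TEXTURE_2D, sceneTexture);
        // drawBloomPass();                               // hypothetical
    }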
 