With the advent of technologies such as hardware occlusion culling, hardware T&L (and therefore very high polygon counts), per-pixel lighting, etc., the priorities for state/depth/visibility sorting have changed.
In the past, if you were using a BSP-based indoor engine, you would sort by depth (either front to back or back to front, depending on your usage). In higher-poly scenes you would sort by texture/render state first.
How do you feel this has changed (if at all)?
I have found this to work best for mixed indoor/outdoor rendering: handle transparency last, sorted by depth, but for opaque polys (assuming your choice of visibility algorithm), sort by:
1) vertex shader / pixel shader combo
2) decal / normal map
thoughts?