Yeah, it's not like the Depth Test (together with Fog, Alpha Test and Stencil Test) is totally gone from the end of the pipeline, however mundane they look in a discussion about PS 3.0.
Anyway, all this talk about pipelines and dynamic allocation of resources got me thinking: as I understand it, the pipelines will always be bound to work on the same triangle (fragments sent from the rasterizer stage). That means they would normally be running the exact same shader program with the same textures, data etc. So unless a decent number of the pipelines are often idle, a dynamically allocated array of units may not make much sense: the fragments all need the same level of ops power (and you have to add complex allocation silicon on top).
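To make that idle-pipes point concrete, here's a toy sketch (all names like `PIPE_COUNT`, `shade` and `run_batch` are made up for illustration, not from any real hardware or driver): a fixed array of pixel pipes runs the same fragment program in lockstep, so the only slack a dynamic allocator could reclaim is the pipes left idle when a triangle yields fewer fragments than the pipeline width.

```python
# Hypothetical model of lockstep pixel pipelines; illustrative names only.
PIPE_COUNT = 8  # assumed number of pixel pipelines

def shade(fragment):
    # stand-in for the single fragment program bound for this triangle
    return fragment * 2

def run_batch(fragments):
    """All pipes execute the SAME shader on one triangle's fragments;
    pipes beyond the fragment count in the last group sit idle."""
    results = []
    for start in range(0, len(fragments), PIPE_COUNT):
        group = fragments[start:start + PIPE_COUNT]
        results.extend(shade(f) for f in group)
        idle = PIPE_COUNT - len(group)
        # `idle` pipes do no useful work this pass; a dynamic allocator
        # only pays off if triangles are small often enough to matter
    return results

print(run_batch(list(range(10))))  # 2 passes: 8 busy pipes, then 2 busy / 6 idle
```

The point of the sketch: since every pipe must run the same program on the same triangle anyway, a flexible pool buys you nothing while the batch is full.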
Vertex shaders are a totally different ballgame, of course: there you're not bound by anything like the rasterizer stage, the ops are much less dependent on large data (like textures), and the data just keeps coming in a nice steady stream.
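The contrast can be sketched the same way (again, `transform` and `split_stream` are purely illustrative): since every vertex is independent, the incoming stream can be dealt out across however many units happen to be free, with no rasterizer to synchronise against.

```python
# Hypothetical sketch: vertex work is trivially divisible. Illustrative names.

def transform(vertex):
    # stand-in for a vertex program: no texture reads, no neighbour data
    x, y, z = vertex
    return (x * 2.0, y * 2.0, z)

def split_stream(vertices, unit_count):
    """Deal the vertex stream round-robin across a pool of units;
    any free unit can simply take the next vertex."""
    return [vertices[i::unit_count] for i in range(unit_count)]

verts = [(float(i), float(i), 0.0) for i in range(10)]
lanes = split_stream(verts, 4)
out = [transform(v) for lane in lanes for v in lane]
print(len(out))  # all 10 vertices processed, in whatever order units finish
```

Because there's no shared per-triangle state, a dynamically sized pool of units maps onto this kind of workload much more naturally than onto the pixel side.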