Hi all,
I wish to design an experiment to measure the proportion of GPU frame time a modern game spends rasterizing triangles. I expect it to be quite a small number, due to (a) the highly evolved, multiple parallel rasterizers in modern GPUs, and (b) aggressively optimized draw-call submission in game engines, but I wonder whether multiple deferred passes and tessellated primitives push that share back up. I'm also curious what Beyond3D members think about this.
So, assuming access to the source of a Direct3D 11 engine, how would you isolate and measure the time spent in the rasterizer?
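The most fine-grained thing I can think of is wrapping the draw calls of interest in D3D11 timestamp queries, roughly as in the sketch below (untested; 'device' and 'ctx' stand in for the engine's existing ID3D11Device / ID3D11DeviceContext). Of course this only gives whole-pipeline GPU time for those draws, not the rasterizer stage in isolation, so I'm not sure how to get from there to a raster-only number:

```cpp
// Rough sketch (untested): wrap a range of draw calls in D3D11 timestamp queries.
#include <d3d11.h>

ID3D11Query* disjointQ = nullptr;
ID3D11Query* startQ    = nullptr;
ID3D11Query* endQ      = nullptr;

void CreateQueries(ID3D11Device* device)
{
    D3D11_QUERY_DESC qd = {};
    qd.Query = D3D11_QUERY_TIMESTAMP_DISJOINT;
    device->CreateQuery(&qd, &disjointQ);
    qd.Query = D3D11_QUERY_TIMESTAMP;
    device->CreateQuery(&qd, &startQ);
    device->CreateQuery(&qd, &endQ);
}

void MeasureDraws(ID3D11DeviceContext* ctx)
{
    ctx->Begin(disjointQ);
    ctx->End(startQ);              // timestamp before the draws of interest
    // ... issue the draw calls to be measured ...
    ctx->End(endQ);                // timestamp after them
    ctx->End(disjointQ);
}

// Read back a frame or two later, once the GPU has finished, to avoid stalling.
double ReadBackMilliseconds(ID3D11DeviceContext* ctx)
{
    D3D11_QUERY_DATA_TIMESTAMP_DISJOINT dj = {};
    while (ctx->GetData(disjointQ, &dj, sizeof(dj), 0) != S_OK) { /* spin */ }
    if (dj.Disjoint)
        return -1.0;               // clock frequency changed; discard this sample

    UINT64 t0 = 0, t1 = 0;
    while (ctx->GetData(startQ, &t0, sizeof(t0), 0) != S_OK) { /* spin */ }
    while (ctx->GetData(endQ,   &t1, sizeof(t1), 0) != S_OK) { /* spin */ }
    return double(t1 - t0) * 1000.0 / double(dj.Frequency);
}
```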
Are there tools (e.g. PIX) that can make this measurement easier?
Is there a way to double the rasterizer load without affecting other parts of the pipeline?
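The only sanity check I've come up with for such an experiment is a pipeline statistics query: if CPrimitives roughly doubles while VSInvocations and PSInvocations stay put, the extra work is at least plausibly confined to triangle setup/rasterization. Again just a sketch, with 'device'/'ctx' being the engine's existing objects:

```cpp
// Sketch: D3D11 pipeline statistics around a frame, to check which stages an
// experimental change actually affects.
ID3D11Query* statsQ = nullptr;

void CreateStatsQuery(ID3D11Device* device)
{
    D3D11_QUERY_DESC qd = {};
    qd.Query = D3D11_QUERY_PIPELINE_STATISTICS;
    device->CreateQuery(&qd, &statsQ);
}

void FrameWithStats(ID3D11DeviceContext* ctx)
{
    ctx->Begin(statsQ);
    // ... render the frame ...
    ctx->End(statsQ);

    D3D11_QUERY_DATA_PIPELINE_STATISTICS s = {};
    while (ctx->GetData(statsQ, &s, sizeof(s), 0) != S_OK) { /* spin */ }
    // s.CInvocations / s.CPrimitives: primitives into / out of the clipper,
    // i.e. roughly what the rasterizer sees.
    // s.VSInvocations / s.PSInvocations: vertex and pixel shader work, which
    // ideally should not change if only the rasterizer load is doubled.
}
```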
Thanks in advance.