What is the depth complexity in today's games? [overdraw]

Certainly, it does depend on your point of view.

For me, the interesting question is "what's the cost to the hardware": it's irrelevant to me whether each individual surface is multipass; I just want to know how many times the engine processes each pixel (or whatever coarser or finer unit of granularity).

I'd say a 'purely geometry-based' definition does count additional passes: if you intercept the draw calls and calculate the area drawn, you'll see every pass (which is why I said that was the easiest method to calculate). Comparing and rejecting duplicate geometry is, well, cumbersome :D
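For what it's worth, this is roughly how I'd do the interception on GL hardware with occlusion queries: a sketch only, where indexCount, indices, screenWidth and screenHeight are stand-ins for whatever the intercepted call and swap chain give you. With the depth test forced off, the query counts every fragment rasterized, including every extra pass (the image output is garbage, but this is an analysis run, not a play session).

#include <GL/gl.h>  // assumes GL 1.5 / ARB_occlusion_query and a live context

GLuint query;
glGenQueries(1, &query);
unsigned long long totalFragments = 0;

glDisable(GL_DEPTH_TEST);               // count fragments even if they'd fail Z
// for each intercepted draw call:
glBeginQuery(GL_SAMPLES_PASSED, query);
glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_SHORT, indices);
glEndQuery(GL_SAMPLES_PASSED);

GLuint passed = 0;
glGetQueryObjectuiv(query, GL_QUERY_RESULT, &passed);  // blocks; fine for analysis
totalFragments += passed;               // alpha-killed fragments still won't count

// after the frame:
double overdraw = double(totalFragments) / (screenWidth * screenHeight);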

You'd be astonished how many games don't do loose front-to-back sorting... if I never see another game that draws the sky first, I'll be a happy man.
 
Some of the definitions of overdraw being used here are getting confused because they stem from older PowerVR documentation. I remember a PowerVR talk where multitexture/multipass overdraw was counted, since that was a win for PowerVR: their hardware could apply more textures per pass and also automatically collapse passes thanks to the binning process. Of course, more modern hardware largely negates those advantages, assuming the application makes use of it (and even though older titles may be fixed at two textures per pass, I'm not sure we need to worry about their performance with the likes of the 6800 and X800 around!).
 
Dio said:
I'd say Q3 would typically be between 3 and 4, the increase mostly due to additional passes and more blend effects. As for Doom3, I'd guesstimate 20-40 overall, with about 90% of that being Z/stencil-only traffic.
Just curious: are there any measurements you base the D3 numbers on, or is that just a ballpark guesstimate?
Anyway, just to throw in some actually measured numbers: I know there's a fair number of PS2 games with overdraw around 20 or above. The highest I'm aware of was around 40, though (IIRC mostly overdraw from large particles).
 
Dave B(TotalVR) said:
How about a little app that sets your GPU to just increment a pixel's value by 1 each time it would be written to, instead of writing the colour value? Then you could take a screenshot, sum all the pixel values and divide by the number of pixels to get the overdraw.
Except that when you modify the command/data stream, you are not guaranteed to get the same results, due to driver optimizations etc. It would possibly give a rough estimate.
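That said, you don't actually have to modify the stream to get a first-order number: the stencil buffer can do the counting for you. A quick GL sketch of the idea (width/height are stand-ins; assumes an 8-bit stencil buffer, so the per-pixel count saturates at 255):

#include <GL/gl.h>
#include <vector>

glClear(GL_STENCIL_BUFFER_BIT);          // start all counters at zero
glEnable(GL_STENCIL_TEST);
glStencilFunc(GL_ALWAYS, 0, ~0u);        // every fragment passes the stencil test
glStencilOp(GL_KEEP, GL_INCR, GL_INCR);  // increment whether Z passes or fails

// ... render the frame normally ...

std::vector<GLubyte> counts(width * height);
glReadPixels(0, 0, width, height, GL_STENCIL_INDEX, GL_UNSIGNED_BYTE, &counts[0]);

unsigned long long total = 0;
for (size_t i = 0; i < counts.size(); ++i)
    total += counts[i];                  // note: GL_INCR clamps at 255
double overdraw = double(total) / (width * height);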
IMO the definition of overdraw is somewhat fuzzy nowadays. At the simplest, you could calculate overdraw based purely on the polygons sent to the API, but we know that a lot of that won't get rendered because of culling and overdraw-reduction features. At the other end of the spectrum, you would count only pixels/fragments drawn with full features, i.e. fully lit and shaded. That number is hard to get, as we don't know how smart the rendering pipeline is internally.
Somewhere in between are the solutions with a first Z-pass and the like.
Another issue: should shadow polygons be counted as overdraw? Should transparent polygons be counted, given that those aren't actually "wasted" pixels but contribute to the final image?
In short, someone had better write down a good definition of what exactly overdraw is :)
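For what it's worth, here's my stab at separating the candidate definitions; nothing official, just the three counters I'd keep apart (names and the stats plumbing are mine):

// Three different "overdraw" figures, depending on where you count.
struct OverdrawStats {
    unsigned long long submitted;   // fragments implied by all polygons sent to the API
    unsigned long long rasterized;  // fragments left after culling / early rejection
    unsigned long long shaded;      // fragments actually lit and shaded in full
};

OverdrawStats stats = {};           // filled by whatever instrumentation you trust

// Divide each by the number of screen pixels for a per-frame figure:
double pixels = double(width * height);
double submittedOverdraw  = stats.submitted  / pixels;  // upper bound
double rasterizedOverdraw = stats.rasterized / pixels;
double shadedOverdraw     = stats.shaded     / pixels;  // the real cost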
 
It should be possible to do this scientifically. I know there are a few software rasterizers out there, so it should be fairly simple to render a few frames of your favorite application to find out.

I assume the rasterizer would cull pixels that fail the Z test, so it should be trivial to capture the number of trivially rejected and accepted pixels.

Of course, tex_kill in the fragment program will mess this up...
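The counting itself is just a couple of counters in the inner loop; something like the sketch below (hand-waving the triangle setup; shadePixel, stats, zbuffer and dzdx are stand-ins), and a third counter in the texkill path would cover that case too:

// Instrumented per-pixel loop of a scanline rasterizer (sketch).
for (int x = xStart; x < xEnd; ++x) {
    ++stats.fragmentsTotal;                  // every candidate pixel
    if (z < zbuffer[y * width + x]) {        // depth test (less-than convention)
        ++stats.fragmentsAccepted;
        zbuffer[y * width + x] = z;
        shadePixel(x, y);                    // full shading happens only here
    } else {
        ++stats.fragmentsRejected;           // trivially rejected by Z
    }
    z += dzdx;                               // step interpolated depth
}
// overdraw_total  = fragmentsTotal    / screen pixels
// overdraw_shaded = fragmentsAccepted / screen pixels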
 
Chalnoth said:
Ailuros said:
Unless my memory betrays me, I think D. Vogel said somewhere that overdraw in the UE2-based UT200x games is around 4-5, but can also reach levels around 7.0.

I'd have a different question though: what would the estimated average level of opaque overdraw be in today's games? Somewhere around 3.0 maybe?
Well, if the UE2 engine does any sort of depth sorting at all (even coarse), I'd rather doubt that an actual rendered overdraw of 4-5 would be standard (on modern video cards with early-Z and such, that is).

Edit: In fact, since I believe the game still uses a portal-based rendering system, I think some coarse depth sorting is implicit.

I recall a claimed average overdraw of between 4 and 5.0 for the original Serious Sam already. I also recall reading that Q3a's demo001 had a measured overdraw of something like ~3.4 (reduced to ~3.1 with early-Z).

With those numbers in mind, I wouldn't think that a claim of 4-5 (with occasional peaks up to 7.0) is unrealistic for UT2003/4.

Completely OT: does anyone have any idea why performance is so abysmally bad on all accelerators in ONS-Primeval/UT2004? Did some artist code that one just for kicks?
 