How is FPS calculated?

infra_red

Newcomer
Ok, I know the title sounds stupid :rolleyes:

BUT, I was just wondering: do games (and 3rd party programs like FRAPS) calculate the game FPS based on the Windows clock?

I have noticed that the Windows clock is not entirely accurate. Every 5 seconds or so it pauses for a bit longer than usual, and I believe this would yield a higher reported FPS.

As my knowledge on this matter is quite minimal, please forgive me if I am being a total idiot :?
 
The Windows clock is not used. The most accurate way is usually to use the processor's internal cycle counter (via a special assembly instruction), but this can give you problems if your app is running on a laptop with SpeedStep or similar technology, since the processor frequency can then vary with load.
The other viable alternative is to use the performance counter, which is available through the Windows API and runs from the PCI bus clock (I think). But that can be buggy on certain chipsets, causing it to jitter by large amounts once in a while. There are other methods available, but they generally have too low resolution. Timing is surprisingly hard.
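For what it's worth, here is a minimal sketch of the performance counter approach on Win32. The Sleep() call is only a stand-in for the real rendering work, and the loop length is arbitrary:

Code:
// Minimal sketch of per-frame timing with the Win32 performance counter
// (QueryPerformanceCounter / QueryPerformanceFrequency). The Sleep() call
// is just a stand-in for the game's real rendering work.
#include <windows.h>
#include <cstdio>

int main()
{
    LARGE_INTEGER freq, prev, now;
    QueryPerformanceFrequency(&freq);   // counter ticks per second
    QueryPerformanceCounter(&prev);

    for (int frame = 0; frame < 300; ++frame)
    {
        Sleep(16);                      // placeholder workload (~60 fps)

        QueryPerformanceCounter(&now);
        double dt = double(now.QuadPart - prev.QuadPart) / double(freq.QuadPart);
        prev = now;

        double fps = (dt > 0.0) ? 1.0 / dt : 0.0;
        printf("frame %d: %.3f ms (%.1f fps)\n", frame, dt * 1000.0, fps);
    }
    return 0;
}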
 
GameCat said:
The Windows clock is not used. The most accurate way is usually to use the processor's internal cycle counter (via a special assembly instruction), but this can give you problems if your app is running on a laptop with SpeedStep or similar technology, since the processor frequency can then vary with load.
The other viable alternative is to use the performance counter, which is available through the Windows API and runs from the PCI bus clock (I think). But that can be buggy on certain chipsets, causing it to jitter by large amounts once in a while. There are other methods available, but they generally have too low resolution. Timing is surprisingly hard.

If you want to use the processor's internal cycle counter, then you also have to find out what frequency the processor runs at. I don't know any other way to do that than to count cycles over a given period of time (e.g. 1 second), which brings us back to having to use the performance counter (that's valid for the Win32 API, at least).
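Roughly like this sketch, assuming MSVC-style intrinsics for the cycle counter; the 100 ms calibration interval is an arbitrary choice for the example:

Code:
// Rough sketch of the calibration described above: count cycles (__rdtsc)
// over an interval measured with the performance counter to estimate the
// cycle counter frequency. Assumes MSVC-style intrinsics.
#include <windows.h>
#include <intrin.h>
#include <cstdio>

int main()
{
    LARGE_INTEGER freq, start, now;
    QueryPerformanceFrequency(&freq);

    unsigned long long tscStart = __rdtsc();
    QueryPerformanceCounter(&start);

    // Busy-wait roughly 100 ms as measured by the performance counter.
    do {
        QueryPerformanceCounter(&now);
    } while (now.QuadPart - start.QuadPart < freq.QuadPart / 10);

    unsigned long long tscEnd = __rdtsc();
    double seconds = double(now.QuadPart - start.QuadPart) / double(freq.QuadPart);
    double tscHz   = double(tscEnd - tscStart) / seconds;

    printf("estimated cycle counter frequency: %.0f MHz\n", tscHz / 1e6);
    return 0;
}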


Excuse my crappy English
 
So the performance counter is the simple correct answer (to "how do I get accurate timing on Win32")? I think it was last time I checked.
Does it do anything about SpeedStep etc. nowadays, or does it just ignore the possibility of a variable clock on the host?
 
I use the performance counter as well. I also always compute the average per second, and the slowest rendered frame in that second. This gives a more accurate view of how smoothly things run. Some games or graphics card drivers artificially bump the framerate by doing some work only every few frames, but this results in irregularities in rendering speed that visually make it less smooth.
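Something like this sketch, assuming a per-frame delta (in seconds) from whatever timer is in use; the frame times fed in at the bottom are made up:

Code:
// Sketch of the per-second bookkeeping described above: average fps over
// each one-second window plus the slowest frame in that window.
#include <algorithm>
#include <cstdio>

struct FpsStats
{
    double windowTime = 0.0;   // time accumulated in the current window
    int    frames     = 0;     // frames rendered in the current window
    double worstDt    = 0.0;   // longest frame time seen in the window

    void onFrame(double dt)
    {
        windowTime += dt;
        frames     += 1;
        worstDt     = std::max(worstDt, dt);

        if (windowTime >= 1.0)     // report once per second
        {
            printf("avg %.1f fps, slowest frame %.1f ms\n",
                   frames / windowTime, worstDt * 1000.0);
            windowTime = 0.0;
            frames     = 0;
            worstDt    = 0.0;
        }
    }
};

int main()
{
    FpsStats stats;
    for (int i = 0; i < 300; ++i)
        stats.onFrame(i % 60 == 0 ? 0.040 : 0.016);   // made-up frame times
    return 0;
}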
 
I think an interesting question is: how should one actually calculate the average framerate?

You could, for example, add up all the frames rendered, and divide by the total time. That would be an average. (N/(t1+t2+...+tN)) This method, the most obvious one, would give a greater weight to frames that take a long time to render.

Or you could record the framerate on a per-frame basis, and average that quantity.
((1/t1+1/t2+...+1/tN)/N) This method gives a greater weight to frames that take a short time to render, and would be more accurately described as the "average frame rate."

Or, if you wanted to, you could average the quantity ln(t) for each frame. This logarithmic average, which works out to the geometric mean of the frame times, gives equal weight to both slow and fast frames. (Full eqn: e^((ln(t1)+ln(t2)+...+ln(tN))/N).)

Each method would give you a different result.
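Purely for illustration, here is a sketch that computes all three on the same list of frame times; the sample values are invented:

Code:
// Sketch comparing the three averages described above on one list of
// frame times (in seconds). The sample values are invented.
#include <cmath>
#include <cstdio>
#include <vector>

int main()
{
    std::vector<double> t = { 0.016, 0.016, 0.050, 0.016, 0.100 };

    double sumT = 0.0, sumFps = 0.0, sumLogT = 0.0;
    for (double ti : t)
    {
        sumT    += ti;
        sumFps  += 1.0 / ti;
        sumLogT += std::log(ti);
    }
    const double n = double(t.size());

    double totalTimeAvg = n / sumT;                     // N / (t1 + ... + tN)
    double perFrameAvg  = sumFps / n;                   // (1/t1 + ... + 1/tN) / N
    double logAvg       = 1.0 / std::exp(sumLogT / n);  // 1 / geometric mean of t

    printf("total-time average:  %.1f fps\n", totalTimeAvg);
    printf("per-frame average:   %.1f fps\n", perFrameAvg);
    printf("logarithmic average: %.1f fps\n", logAvg);
    return 0;
}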

Regardless, I think the biggest help for judging just how well a computer renders a game would be to also give the standard deviation of the quantity averaged, which tells you roughly what fraction of frames fall outside some band around the average.

For example, if a benchmark averages 60fps with a standard deviation of 10fps, then I could say that ~84% of the time the framerate was above 50fps.
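As a sketch (the frame times here are invented; a real benchmark would use its own data):

Code:
// Sketch of the suggestion above: report the mean and standard deviation of
// the per-frame framerate, plus the fraction of frames that landed within
// one standard deviation of the mean. Frame times are invented.
#include <cmath>
#include <cstdio>
#include <vector>

int main()
{
    std::vector<double> t = { 0.016, 0.017, 0.015, 0.033, 0.016, 0.020 };

    std::vector<double> fps;
    for (double ti : t) fps.push_back(1.0 / ti);

    double mean = 0.0;
    for (double f : fps) mean += f;
    mean /= fps.size();

    double var = 0.0;
    for (double f : fps) var += (f - mean) * (f - mean);
    var /= fps.size();
    const double stddev = std::sqrt(var);

    int inside = 0;
    for (double f : fps)
        if (std::fabs(f - mean) <= stddev) ++inside;

    printf("mean %.1f fps, stddev %.1f fps, %.0f%% of frames within one sigma\n",
           mean, stddev, 100.0 * inside / fps.size());
    return 0;
}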
 
Chalnoth said:
Regardless, I think the biggest help for judging just how well a computer renders a game would be to also give the standard deviation of the quantity averaged, which tells you roughly what fraction of frames fall outside some band around the average.

For example, if a benchmark averages 60fps with a standard deviation of 10fps, then I could say that ~84% of the time the framerate was above 50fps.

This is true if rates are normally distributed.
An even better thing would be to plot a cumulative distribution chart for, say, a FRAPS run, essentially similar to what [H] does, but ordered from lowest to highest FPS and plotted against percentage. That way you could see what fraction of the time you were above any given FPS, without assuming any particular statistical distribution.
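A quick sketch of that idea, using invented frame times in place of a real FRAPS frame-time dump:

Code:
// Sketch of the cumulative-distribution idea: from a list of frame times,
// report what fraction of the total time was spent at or above a few fps
// thresholds. The frame times here are invented.
#include <cstdio>
#include <vector>

int main()
{
    std::vector<double> t = { 0.016, 0.018, 0.016, 0.040, 0.017, 0.025, 0.016 };

    double total = 0.0;
    for (double ti : t) total += ti;

    const double thresholds[] = { 30.0, 40.0, 50.0, 60.0 };
    for (double fpsThreshold : thresholds)
    {
        // frames running at or above the threshold have frame times <= 1/threshold
        const double maxDt = 1.0 / fpsThreshold;
        double time = 0.0;
        for (double ti : t)
            if (ti <= maxDt) time += ti;

        printf(">= %.0f fps for %.0f%% of the time\n",
               fpsThreshold, 100.0 * time / total);
    }
    return 0;
}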

ERK
 