Chalnoth said: Yes there is. The framerate for one frame is the inverse of the amount of time it takes to render that frame.
again, there's no such thing as an 'average [framerate] per single frame', just as there's no average number of Jons per single Jon. yes, it's computable. no, that does not make it any more meaningful - it's a measure that measures nothing. furthermore, how you read that into my original post is beyond my comprehension.
Possibly. But for GPU limitations, it makes more sense to just look at the framerate in the way I described it (averaging over frames instead of times).
those frames you want to measure over take up a certain amount of time - whether you define the measurement window by n frames or by the time those n frames took, you end up computing the same ratio, frames divided by elapsed time - it's a matter of convention. see the sketch below.
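A minimal sketch of that convention point, using made-up per-frame render times (the numbers and variable names here are purely illustrative): whether the window is fixed by a frame count or by a duration, the resulting statistic is the same frames-over-time ratio.

```python
# made-up per-frame render times in seconds, purely for illustration
frame_times = [0.020, 0.025, 0.016, 0.040, 0.019]

# window defined by a frame count: the first n frames
n = 4
fps_by_frames = n / sum(frame_times[:n])

# window defined by a duration: all frames completed within the first t seconds
t = sum(frame_times[:n])  # choose t so both windows cover the same frames
frames_done, elapsed = 0, 0.0
for ft in frame_times:
    if elapsed + ft > t:
        break
    elapsed += ft
    frames_done += 1
fps_by_time = frames_done / elapsed

print(fps_by_frames, fps_by_time)  # identical: same frames, same time, same ratio
```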
CPU limitations may be different, of course. But since CPU limitations will merely cause this method of measuring framerate to result in too low an average, this isn't that much of a problem.
aha. and what was the relevance of introducing timedemos into this discussion of framerate? you have n frames over a time t (usually expressed in 'frames per second') - where those frames come from is totally irrelevant (could be your VCR just as well).
Well, of course, but my point was that in a timedemo, each frame represents the same fixed timespan. This is not going to be the case in-game (and even if it happened to be, due to very special circumstances, both methods would report the same average, so it's a moot point).
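To make the two averages under discussion concrete, here is a small sketch with made-up frame times (the function names are mine, not anything from the thread): the per-frame arithmetic mean of instantaneous framerates versus total frames divided by total elapsed time. When every frame takes the same time the two coincide, which is the parenthetical claim above; when frame times vary, they diverge.

```python
def per_frame_mean(frame_times):
    """arithmetic mean of instantaneous framerates (1 / frame time)."""
    return sum(1.0 / t for t in frame_times) / len(frame_times)

def frames_over_time(frame_times):
    """total frames divided by total elapsed time."""
    return len(frame_times) / sum(frame_times)

equal_times = [0.020] * 5                    # every frame takes the same time
varied_times = [0.010, 0.010, 0.010, 0.070]  # one long stall among fast frames

print(per_frame_mean(equal_times), frames_over_time(equal_times))    # 50.0, 50.0
print(per_frame_mean(varied_times), frames_over_time(varied_times))  # ~78.6, 40.0
```

With varied frame times the per-frame mean is always at least as high as frames-over-time (it weights fast frames more heavily), which is why the two camps in this exchange get different numbers from the same run.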
again, the relevance to my original post is nil. my original post was about 'average framerates vs stable/minimal framerates'. but hey, if you're looking for an intelligent discussion, open up a new thread on whatever topic amuses you and i promise to participate ; )