Frame Rate Analysis Thread (Simple Rules Post #2)

I've got a range of CineForm compressed MGS4 clips based on gameplay taken from the first act:

Clip #1: 1566 frames, 1001 dupes, 21.623fps
Clip #2: 1825 frames, 1098 dupes, 23.882fps
Clip #3: 1032 frames, 601 dupes, 25.024fps
Clip #4: 893 frames, 486 dupes, 27.310fps
Clip #5: 773 frames, 386 dupes, 30fps
Clip #6: 813 frames, 437 dupes, 27.709fps
Clip #7: 1241 frames, 628 dupes, 29.613fps
Clip #8: 727 frames, 443 dupes, 23.389fps
Clip #9: 1197 frames, 598 dupes, 30fps
Clip #10: 1113 frames, 591 dupes, 28.112fps
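
For reference, figures like these follow directly from the frame and dupe counts. A minimal sketch of the arithmetic in Python, assuming a 59.94Hz capture (an assumption on my part, not something stated in the thread):

CAPTURE_HZ = 59.94  # assumed NTSC capture rate

def average_fps(total_frames, dupes, capture_hz=CAPTURE_HZ):
    # Unique frames shown per second of captured video.
    unique = total_frames - dupes
    duration_s = total_frames / capture_hz
    return unique / duration_s

# Clip #10 from the list above:
print(round(average_fps(1113, 591), 3))  # ~28.112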
 
I think we need to decide on a better way to represent the framerate (better than an average), and we also need to explain the effects of vsync on the figures.

For example, for a vsynced game, if a frame takes just over 33.3ms, the framerate will immediately drop to 20fps, even if without vsync it would only have dropped to 29fps. So an average figure like the ones quoted above is pretty meaningless IMO.

Is it possible to deduce the rendering time of a non-vsynced game based on the location of the screen tear?
 
For example, for a vsynced game, if a frame takes just over 33.3ms, the framerate will immediately drop to 20fps, even if without vsync it would only have dropped to 29fps. So an average figure like the ones quoted above is pretty meaningless IMO.
I don't like this characterization. For a 34.5 ms render, you should describe it like this:

Without V-sync, 10% of the frame will be 20 fps and 90% will be 30 fps. A tear divides the regions.

With V-sync and double buffering, you will see 20 fps.

With V-sync and triple buffering, you have a 10% chance to see 20 fps and 90% chance to see 30 fps, and there is no tear.
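
To make the quantization concrete, here is a rough Python sketch of the double-buffered case and the tear position, assuming a 60Hz display. The exact percentages depend on when the flip lands relative to scanout, so they won't necessarily match the 10%/90% illustration above:

import math

VBLANK_MS = 1000.0 / 60.0  # one refresh interval at 60Hz

def vsynced_fps(render_ms):
    # With V-sync and double buffering the flip waits for the next vblank,
    # so the displayed interval rounds up to a whole number of refreshes.
    intervals = math.ceil(render_ms / VBLANK_MS)
    return 1000.0 / (intervals * VBLANK_MS)

def tear_position(render_ms):
    # Without V-sync the flip happens mid-scanout; this is how far down the
    # screen the tear lands (0 = top, 1 = bottom), assuming the previous
    # flip happened at the top of a scanout.
    return (render_ms % VBLANK_MS) / VBLANK_MS

print(vsynced_fps(33.0))              # 30.0
print(vsynced_fps(34.5))              # 20.0 -- the sudden drop being discussed
print(round(tear_position(34.5), 2))  # ~0.07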

Is it possible to deduce the rendering time of a non-vsynced game based on the location of the screen tear?
That will average out in these numbers from grandmaster. Lack of V-sync doesn't affect the accuracy of his measurement unless it's at greater than 60 fps (which would be stupid). Location of the tear only matters if you want to do something with instantaneous framerate.
 
I've got a range of CineForm compressed MGS4 clips based on gameplay taken from the first act:

Clip #1: 1566 frames, 1001 dupes, 21.623fps
Clip #2: 1825 frames, 1098 dupes, 23.882fps
Clip #3: 1032 frames, 601 dupes, 25.024fps
Clip #4: 893 frames, 486 dupes, 27.310fps
Clip #5: 773 frames, 386 dupes, 30fps
Clip #6: 813 frames, 437 dupes, 27.709fps
Clip #7: 1241 frames, 628 dupes, 29.613fps
Clip #8: 727 frames, 443 dupes, 23.389fps
Clip #9: 1197 frames, 598 dupes, 30fps
Clip #10: 1113 frames, 591 dupes, 28.112fps

Seems quite accurate. The game suffers from quite a bit of slowdown, especially in firefights and with dust/smoke.
 
I don't like this characterization. For a 34.5 ms render, you should describe it like this:

Without V-sync, 10% of the frame will be 20 fps and 90% will be 30 fps. A tear divides the regions.

With V-sync and double buffering, you will see 20 fps.

With V-sync and triple buffering, you have a 10% chance to see 20 fps and 90% chance to see 30 fps, and there is no tear.

Yes, I was only talking of double buffering in my example, to illustrate the point.

Your method of representing the framerate seems to me to be a lot better than a straight average, especially as in some sections the framerate in MGS4 can hit 60fps, 30fps, 20fps, etc. Maybe a better way of reporting it would be:

xxxx frames:
1VBL: xxx frames (i.e. 60fps)
2VBL: xxx frames (i.e. 30fps)
3VBL: xxx frames (i.e. 20fps)
etc., if required.
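
A report like that could be generated from the same dupe detection the tool already does. A rough sketch with hypothetical names (is_dupe is just a per-captured-frame flag marking repeats of the previous frame):

from collections import Counter

def vbl_histogram(is_dupe):
    # Map "vblanks a unique frame stayed on screen" -> "number of such frames".
    # 1 VBL ~ 60fps, 2 VBL ~ 30fps, 3 VBL ~ 20fps, and so on.
    counts = Counter()
    run = 0
    for dupe in is_dupe:
        if dupe and run > 0:
            run += 1      # same frame held for another vblank
        else:
            if run:
                counts[run] += 1
            run = 1       # a new unique frame starts
    if run:
        counts[run] += 1
    return counts

# Three unique frames held for 2, 3 and 1 vblanks respectively:
print(vbl_histogram([False, True, False, True, True, False]))
# Counter({2: 1, 3: 1, 1: 1})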

That will average out in these numbers from grandmaster. Lack of V-sync doesn't affect the accuracy of his measurement unless it's at greater than 60 fps (which would be stupid). Location of the tear only matters if you want to do something with instantaneous framerate.

My thinking was that the application grandmaster uses to get the framerate could quite easily use the tear point to calculate instantaneous framerates for each frame. These could then be overlaid on the video to match up framerate and on-screen action at any given time. It would be a nice illustration of exactly what is in the video and what is causing the framerate to dip/jump as the game is played.
 
I've got a range of CineForm compressed MGS4 clips based on gameplay taken from the first act:

Clip #1: 1566 frames, 1001 dupes, 21.623fps
Clip #2: 1825 frames, 1098 dupes, 23.882fps
Clip #3: 1032 frames, 601 dupes, 25.024fps
Clip #4: 893 frames, 486 dupes, 27.310fps
Clip #5: 773 frames, 386 dupes, 30fps
Clip #6: 813 frames, 437 dupes, 27.709fps
Clip #7: 1241 frames, 628 dupes, 29.613fps
Clip #8: 727 frames, 443 dupes, 23.389fps
Clip #9: 1197 frames, 598 dupes, 30fps
Clip #10: 1113 frames, 591 dupes, 28.112fps

a bit all over the place, with reports of it hitting 60fps sometimes. I'd rather have a solid albeit lower framerate.
 
Seems quite accurate. The game suffers from quite a bit of slowdown, especially in firefights and with dust/smoke.

Actually the game is quite smooth IMO; I've seen no talk about the frame rate in any forums either. I think you are reading what you want into the numbers.



How do repeated frames square with the average being 30fps? Occasional jumps to 60fps?

It would be interesting to get numbers for known best-case/worst-case games to use as comparisons.

Good (30fps): Halo 3, Resistance 1

Bad: Saints Row, Mass Effect

Now a game with 26fps might look pretty good compared to ME with 16fps. Just a thought.
 
Actually the game is quite smooth IMO; I've seen no talk about the frame rate in any forums either.

I had first-hand experience with the game, but the videos show it too. Combat with explosions, dust, etc. causes slowdowns. They even occur in non-combat sections in various places.

I think you are reading what you want into the numbers.

The numbers back up what I am saying. It's obvious; if not, then one doesn't really know how to spot framerate differences unless they are very large. That person might consider himself lucky, though, to find 15-25fps smooth with no motion blur in effect at the moment! ;)
 
It would be better if some context about the clips were given. Are they all cutscenes? Which ones? (At least 2 are pre-rendered.)

It is said on the net that there are some frame drops during cutscenes, but the gameplay is generally smooth (reaching 60fps in small rooms at times). Is this true? (e.g., what did you do in-game to make the framerate drop more?)
 
I've seen for myself not only a 60 fps refresh in rooms, but even the gameplay engine jumping up with Snake whizzing about like a Theme Park visitor after a cup of coffee! I found the fluctuations of MGS4's framerate perhaps the most disconcerting aspect visually.
 
Given that MGS4 was going for 60fps (though I haven't seen the game yet), what's so demanding about the game for it to drop to almost 20fps? Maybe this is the wrong thread to ask this; if so, forgive me.
 
I don't think Konami said anything about MGS4 being 60fps. IGN did publish a Q&A that claimed that some segment of MGS4 played at 60fps.
 
a bit all over the place, with reports of it hitting 60fps sometimes. I'd rather have a solid albeit lower framerate.
You would think the framerate would be annoying considering it drops below 30fps and goes up to 60fps at times, but I rarely had any problems with the framerate, and I'm sensitive to framerate drops. The only thing I found weird about the framerate was when it was at 30fps and then quickly jumped to 60fps... it seemed like the camera was moving a lot faster when you moved the analog stick. Other than that, the framerate was fine.
 
Given that MGS4 was going for 60fps (though I haven't seen the game yet), what's so demanding about the game for it to drop to almost 20fps? Maybe this is the wrong thread to ask this; if so, forgive me.
Sometimes you wonder why, but there are many times in the game where there is a lot going on on screen.
 
It would be better if some context about the clips were given. Are they all cutscenes? Which ones? (At least 2 are pre-rendered.)

It is said on the net that there are some frame drops during cutscenes, but the gameplay is generally smooth (reaching 60fps in small rooms at times). Is this true? (e.g., what did you do in-game to make the framerate drop more?)

My post is quite clear. They are gameplay clips taken from the first act of the game. The measurements are also accurate in that I can move through the captures frame by frame and physically see the repeated frames. They are not captures of cut-scenes.

I have not taken any captures of wandering about in a small empty room - I mean, what's the point - but I can confirm that there are momentary jumps to 60fps in isolated circumstances, and the slowdown elsewhere is self-evident if you're sensitive to this kind of thing.

You would think the framerate would be annoying considering it drops below 30fps and goes up to 60fps at times, but I rarely had any problems with the framerate, and I'm sensitive to framerate drops.

I would agree with this, the jumps in frame rate are not a particularly big issue for me in MGS4.
 
My thinking was that the application grandmaster uses to get the framerate could quite easily use the tear point to calculate instantaneous framerates for each frame.

No, I'm afraid it can't. It can detect the number of unique frames in a clip. The sample itself can be from any area of the screen, but it has to be the same area for the whole clip. It can't detect *where* the tear point is.

The best it could do would be a running average over a set number of frames; 60 would be the obvious choice. Then I'd have to figure out some way of overlaying those figures onto the video in After Effects, which in itself would be challenging - a TIFF sequence maybe, or perhaps an animated GIF?
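
For what it's worth, the running-average part is simple enough; a sketch assuming a 60Hz capture and the same hypothetical per-frame dupe flags as before (the After Effects overlay step isn't shown):

def rolling_fps(is_dupe, window=60):
    # Per-frame estimate: unique frames in the last `window` captured frames,
    # scaled to one second at 60Hz.
    fps = []
    for i in range(len(is_dupe)):
        span = is_dupe[max(0, i - window + 1):i + 1]
        unique = sum(1 for d in span if not d)
        fps.append(unique * 60.0 / len(span))
    return fps

# A clip that alternates new frame / dupe settles at 30fps:
print(rolling_fps([False, True] * 60)[-1])  # 30.0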
 
Yes, you could calculate the framerate for every second of the video (for every 60-frame packet) and make a little graph with some dots.
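
As a sketch of that: chunk the dupe flags into 60-frame packets (one second at 60Hz) and plot one dot per second. matplotlib is just an example choice here, not what anyone in the thread is actually using:

import matplotlib.pyplot as plt

def per_second_fps(is_dupe, packet=60):
    # Unique frames in each consecutive 60-frame packet = fps for that second.
    return [sum(1 for d in is_dupe[i:i + packet] if not d)
            for i in range(0, len(is_dupe), packet)]

flags = [False, True] * 300           # made-up 10-second clip hovering at 30fps
fps_per_sec = per_second_fps(flags)
plt.plot(range(len(fps_per_sec)), fps_per_sec, "o")  # one dot per second
plt.xlabel("second")
plt.ylabel("unique frames per second")
plt.show()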
 