*spin* another 60 vs 30 framerate argument

And if the difference between making a turn and crashing is whether you brake at distance 2 (success) or distance 3 (failure), then you're probably braking way too late anyway, because I highly doubt that racing video games routinely create situations where your margin for error is a 16 ms window. That's on the driver, not the game or its framerate.

You haven't ever played any racing sims, have you?

The jump from Forza (30Hz) to Forza 2 was huge, not because of the increased resolution, but because of the full framerate in Forza 2.

Cheers
 
Your brain can't perceive individual frames at anything higher than 10-15 fps, so at 30 fps your brain isn't interpreting the distance traveled between 0 and 6 as 0, 3, 6. It's perceiving a smooth transition between 0 and 6, with higher frame rates providing a smoother transition.

If that is the case, how can we perceive the difference between 30 and 60 fps?
 
You haven't ever played any racing sims, have you?

The jump from Forza (30Hz) to Forza 2 was huge, not because of the increased resolution, but because of the full framerate in Forza 2.

Cheers

I played the first three Forzas, and the framerate wasn't the only thing that doubled.
 
I don't understand how people are arguing that 30 fps is good enough when, in the Power vs Price thread, the same people want as much power as they can get. More power, yet the same old performance. Wasting it on post-processing? 8 enemies on screen?
 
If that is the case, how can we perceive the difference between 30 and 60 fps?

Can you perceive the difference between 50 and 100 pounds? How about 50 versus 51 pounds?

60 fps is smoother than 30 fps, as I said above; it doesn't mean the brain is interpreting images as individual frames.

When you go to brake and take a corner, your brain has determined the optimal time to initiate those actions well before you reach that point. The speed of the physics engine and the overall response time are way more important in racing games than the framerate.
 
When you go to brake and take a corner, your brain has determined the optimal time to initiate those actions well before you reach that point. The speed of the physics engine and the overall response time are way more important in racing games than the framerate.

You can see more clearly what stage of the corner you are at with 60 fps. And taking the corner is more about how much you should steer than about when you start braking, and that is helped a lot by 60 fps.
 
Can you perceive the difference between 50 and 100 pounds? How about 50 versus 51 pounds?

60 fps is smoother than 30 fps, as I said above; it doesn't mean the brain is interpreting images as individual frames.

Since we can see a difference between 30 and 60 fps, we humans obviously interpret the extra frames. But as far as I know, no one understands exactly HOW we do it.
 
Two things. You don't understand gaming mechanics with respect to lag; the lag you talk about in terms of online play does not work at all the way you think. More on this later.

1. A higher frame rate gives you more visual cues, making you able to, for example, adjust aim faster, or predict when that square of yours is hitting the middle. It's not, as you say, just aesthetics. If a shooter runs at 60 fps, you can aim more accurately (you get visual cues about your aiming twice as fast).
Although I stand by my example of a lower framerate not hindering accuracy for a clearly predictable, linear event like a moving cube, after reading your post I did reconsider the experience of an FPS, and in particular the frequent camera rotations. Spinning the camera is definitely going to be far clearer and less disorientating at higher framerates. I also suppose that at higher framerates there will be better cues for the fine adjustments made when tracking a player, as you describe, and you'll get feedback on your thumb control enabling better adjustments. So I guess I'm changing my opinion, although I've always been an advocate of higher framerates. ;)

Furthermore, most decent gaming TVs have less than 16-30 ms of lag...
Sure, but developers typically aim for the lowest common denominator. Joe Gamer buying COD is likely on a laggy TV, not in game mode if it even has one. In that respect the added latency of 30 fps won't bother them IMO, whereas it is going to jar with high-level players.

2. Furthermore, most online games today work in a manner where your aiming and movement are not really bound by Internet lag in the way you think... (description of local syncing)
I did not fully appreciate that. Is that universal? I've seen enough online games this gen where shots didn't register that I find it hard to believe. Obviously there's net code of sorts ironing out the sync issues (hence some games get praised for their net code), but there's certainly a notable effect of lag on gaming, such that changing servers can provide a better experience. I'd like to know what internet lag really adds to real lag across different games, in relation to screen refresh lag.
 
Wow. You must have a poor opinion of a human's ability to track movement and predict its own position in space. 60 fps isn't going to help you take corners "way" better than 30 fps. Your brain is well ahead of anything being drawn on screen, as it's constantly predicting object movements and determining their future positions. The frames provide visual feedback, and the more the merrier (in terms of error correction), but the frame rate threshold needed to go from being a bad driver to a good driver doesn't sit between 30 and 60 fps.

Your brain can't perceive individual frames at anything higher than 10-15 fps, so at 30 fps your brain isn't interpreting the distance traveled between 0 and 6 as 0, 3, 6. It's perceiving a smooth transition between 0 and 6, with higher frame rates providing a smoother transition.

And if the difference between making a turn and crashing is whether you brake at distance 2 (success) or distance 3 (failure), then you're probably braking way too late anyway, because I highly doubt that racing video games routinely create situations where your margin for error is a 16 ms window. That's on the driver, not the game or its frame rate.

I'm not talking about frame rendering time in the context of perception of movement or the reaction time of the human eye/brain, but about the relation between distance traveled and speed within the physical simulation of the game. An object moving at high speed and calculated at a lower frame rate will cover a greater distance between each frame than when calculated at a higher frame rate.

Let's assume you're watching the result of an offline rendering project: an object moving horizontally in a straight line, with a static field of view limited to a 30 meter distance. In that scene the object moves at a constant 30 meters per second, and the result is saved at an extremely low frame rate of 3 fps, meaning the position is calculated only 3 times per second (needless to say, the monitor refresh rate could be 60 or 120+; it wouldn't matter). You would see the object at meters 10, 20 and 30, each frame capturing a distance of 10 meters.

Now what would the output be if the object was moving at 60 meters per second and still calculated/rendered at 3 fps? We wouldn't see it 3 times across that distance but only once, at meter 20; in the 2nd frame the object would be at 40 meters, outside the field of view, and in the 3rd frame at 60 meters. Now suppose that, after an upgrade, it could render at 6 fps with the object still moving at 60 meters per second: the output would again show the object at meter 10 in frame 1, meter 20 in frame 2 and meter 30 in frame 3. Now extrapolate this to 30 fps and 60 fps.

edit: I'm very sleepy, but I think I got the logic straight. Or not, and I just embarrassed myself :p
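The logic above can be checked with a small sketch (the function name and 1-second window are mine, just for illustration): sample an object's position at a given simulation rate and keep only the samples that land inside the 30 m field of view.

```python
# Sketch of the example above: which sampled positions of a moving object
# fall inside a static 30 m field of view, at different simulation rates.

def visible_samples(speed_mps, fps, fov_m=30.0, duration_s=1.0):
    """Return the sampled positions (in metres) that land inside the field of view."""
    frames = int(fps * duration_s)
    positions = [speed_mps * (n + 1) / fps for n in range(frames)]
    return [p for p in positions if p <= fov_m]

print(visible_samples(30, 3))   # [10.0, 20.0, 30.0] -> seen on three frames
print(visible_samples(60, 3))   # [20.0]             -> seen on only one frame
print(visible_samples(60, 6))   # [10.0, 20.0, 30.0] -> three frames again
```

Doubling the sample rate halves the distance covered per frame, which is exactly why the 60 m/s object becomes visible on three frames again at 6 fps.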
 
Sure, but developers typically aim for the lowest common denominator. Joe Gamer buying COD is likely on a laggy TV, not in game mode if it even has one. In that respect the added latency of 30 fps won't bother them IMO, whereas it is going to jar with high-level players.

It is not just the latency that bothers people; it is the larger movement increments. And I do not think there is any difference between Joe Gamer and people like us in the feeling we get from playing the game.
 
I agree, Joe Gamer won't like lagginess, but the control lagginess they get at 30 fps isn't going to be solved by switching to 60 fps when it's mostly coming from their TV.
 
I don't understand how people are arguing that 30 fps is good enough when, in the Power vs Price thread, the same people want as much power as they can get. More power, yet the same old performance. Wasting it on post-processing? 8 enemies on screen?

9 enemies on screen!

30fps for life! At least in turn-based RPGs the additional fps don't do anything with regard to control.
 
I'm not talking about frame rendering time in the context of perception of movement or the reaction time of the human eye/brain, but about the relation between distance traveled and speed within the physical simulation of the game. An object moving at high speed and calculated at a lower frame rate will cover a greater distance between each frame than when calculated at a higher frame rate.

Let's assume you're watching the result of an offline rendering project: an object moving horizontally in a straight line, with a static field of view limited to a 30 meter distance. In that scene the object moves at a constant 30 meters per second, and the result is saved at an extremely low frame rate of 3 fps, meaning the position is calculated only 3 times per second (needless to say, the monitor refresh rate could be 60 or 120+; it wouldn't matter). You would see the object at meters 10, 20 and 30, each frame capturing a distance of 10 meters.

Now what would the output be if the object was moving at 60 meters per second and still calculated/rendered at 3 fps? We wouldn't see it 3 times across that distance but only once, at meter 20; in the 2nd frame the object would be at 40 meters, outside the field of view, and in the 3rd frame at 60 meters. Now suppose that, after an upgrade, it could render at 6 fps with the object still moving at 60 meters per second: the output would again show the object at meter 10 in frame 1, meter 20 in frame 2 and meter 30 in frame 3. Now extrapolate this to 30 fps and 60 fps.

edit: I'm very sleepy, but I think I got the logic straight. Or not, and I just embarrassed myself :p

The distance traveled over time at a certain speed is the same regardless of frame rate. At 30 fps you have enough frames that your brain is basically merging all the images together. Your brain isn't measuring your movement in 5 or 10 foot increments at 30 or 60 fps, regardless of how the images are technically presented to your eye.

I can take a corner at 30 fps as well as I can at 60 fps.

In fact, unless you spend the vast majority of your time racing on randomly generated tracks, you are mostly racing on tracks for which you have built a mental model of how to race, a model that becomes more robust as your experience on those tracks increases. It all becomes second nature.

The strongest case for 60 fps over 30 fps in racing is when you are in close proximity to other competing drivers, who, unlike the track itself, can easily break any prediction model, especially when bunched closely together. They require more attention than the track itself. An extra 16 ms may really matter (I'm still not convinced) under such circumstances, because you are exposed to actions that can't be predicted and are more likely to require immediate reaction. That includes watching out for collisions, trying to find the moment when overtaking is possible, or trying to keep somebody from overtaking you. And in those instances the relative speed between you and any other driver around is nowhere near 30 meters per second, unless you're on a straight and they are parked in front of you in the middle of the track.
 
Glad you agree shifty :)

;)
I did not fully appreciate that. Is that universal? I've seen enough online games this gen where shots didn't register that I find it hard to believe. Obviously there's net code of sorts ironing out the sync issues (hence some games get praised for their net code), but there's certainly a notable effect of lag on gaming, such that changing servers can provide a better experience. I'd like to know what internet lag really adds to real lag across different games, in relation to screen refresh lag.

Pretty much all online games have certain mechanics driven on the client side, with the server just correcting you if you break any rules.

Typically, in the games out there you can at least move (and in the case of an FPS, aim and often also shoot) freely as long as you don't break some rule. It's not like the client tells the server you want to take one step, the server moves you one step, and only then do you see it on screen.

That just doesn't work without very low latency, because it would be a total immersion breaker (particularly on consoles, which often lack dedicated servers).

In any case, to answer your question: Internet lag does not necessarily add to the lag you experience from input lag, frame rate etc. While movement, aiming and shooting feel instant in any well-designed online game, Internet lag is usually seen/experienced in other areas (and this is the reason why you want dedicated servers).

Main examples:

1. People warping because of lag spikes (the host or a client has poor/unstable upload speed).
2. You get killed as soon as you see the enemy. Reason: the enemy has a faster connection to the server, therefore his shots register faster than yours.
Reason 2: he will also "see" you faster. A typical example is a person coming out of a building. If you or the server have a poor connection, your console will not receive the info that the person is outside in a timely fashion (the other guy, who initiated the movement, sees it immediately on his screen and has more time to shoot you).
3. NPCs just don't die! Reason: in co-op games against the AI, the AI reacts to whatever happens in the zero-latency world of the server. If you have a poor connection, the AI has already done many things before you are told about them!

None of these things have any impact on the lag coming from input etc. Internet lag can certainly affect the gameplay experience tremendously, but it does not stack on top of input lag.

The best examples of this, I would say, are COD online, BF3 and, if I remember correctly, KZ2 (I didn't play the others). Even if the connection is laggy (as long as there are no spikes that warp you in place), your movement is as sharp as in a low-latency game, but the impact of Internet lag on the game is of course tremendous.

In regard to KZ2, I seem to recall a discussion on this forum where many claimed it was lag free because they could not notice it. Then, when I gave an example with the cloaked sniper, people suddenly spotted it. (The invisibility switches off if you hit something. Therefore, by looking at how long it took from the decloaking (client driven) to when you got confirmation of your kill (server side), you could tell the lag was quite severe in some cases.)
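The client-side mechanics described above are usually implemented as prediction plus reconciliation. A minimal sketch (class and method names are mine, purely illustrative): the client moves immediately and logs its inputs; when the server later acknowledges an input, the client snaps to the authoritative state and replays the inputs the server hasn't processed yet.

```python
# Minimal client-side prediction with server reconciliation (1D movement).

class PredictingClient:
    def __init__(self):
        self.position = 0.0
        self.pending = []          # (sequence_number, move_delta) not yet acked
        self.next_seq = 0

    def apply_input(self, delta):
        """Move locally right away; log the input for later reconciliation."""
        self.position += delta
        self.pending.append((self.next_seq, delta))
        self.next_seq += 1
        return self.next_seq - 1   # sequence number sent to the server

    def on_server_update(self, server_position, last_acked_seq):
        """Adopt the server's authoritative position, then replay
        any inputs the server has not yet processed."""
        self.pending = [(s, d) for s, d in self.pending if s > last_acked_seq]
        self.position = server_position
        for _, delta in self.pending:
            self.position += delta

client = PredictingClient()
client.apply_input(1.0)            # seq 0: position becomes 1.0 instantly
client.apply_input(1.0)            # seq 1: position becomes 2.0 instantly
# Server update arrives late, having processed only seq 0:
client.on_server_update(1.0, last_acked_seq=0)
print(client.position)             # 2.0 -> the unacked move was replayed
```

This is why movement feels instant regardless of ping: the round trip only matters when the server disagrees with what the client predicted.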
 
As a side note on this framerate discussion: one guy at my university was a pro gamer some 10 years before I went to school. He actually won the world cup in Counter-Strike.

Anyway, he told me that at one championship in Korea, Samsung wanted to showcase their new LCDs, and the gaming rigs were all equipped with one. Now these guys are as hardcore as it gets; do you think they said yes to 60 fps? Nope! They all refused to play, so Samsung was forced to provide their old CRT (120 Hz) screens and then simply put big black paper boxes partially covering them, giving the impression of slim LCDs!
 
The distance traveled over time at a certain speed is the same regardless of frame rate. At 30 fps you have enough frames that your brain is basically merging all the images together. Your brain isn't measuring your movement in 5 or 10 foot increments at 30 or 60 fps, regardless of how the images are technically presented to your eye.

I can take a corner at 30 fps as well as I can at 60 fps.

No, you cannot. If you make a slight mistake, 60 fps will tell you that you made the mistake twice as fast, and you have more time to correct it.
 
No, you cannot. If you make a slight mistake, 60 fps will tell you that you made the mistake twice as fast, and you have more time to correct it.

The gap between 16 ms and 33 ms is not huge. In fact it's quite small.

Furthermore, unless you spend your time driving with your eyes fixated on the road at the very front of the car, your eyes will fixate on reference points that diminish the sense of speed, either through scale or through relative speed. It's one thing to look to the side and actually see your car moving 10 feet a frame. It's another to look down the track and see a reference point move a few inches a frame, or a competitor's car whose speed relative to yours is much slower than 10 feet per frame. When cornering your eyes have to fixate on closer reference points, but that's compensated for by the reduction in speed that cornering requires.

To put that in perspective: have you ever seen a jumbo jet streak across the sky? Does it look like it's going 400-600 mph? If you rendered that in a game at 30 fps, would your eyes be able to measure its movement of 30 feet per individual frame?

Never mind that racing is probably one of the worst genres for visual cues for actions requiring immediate reaction, for some of the most important aspects of the genre like cornering. Try playing GT or Forza with no sound, no rumble and no force feedback, and tell me how that works for you.
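The jumbo-jet point can be made concrete: what matters to the eye isn't metres per frame but the angle swept across your vision per frame, which depends on distance. A small sketch, with my own illustrative numbers (a jet at roughly 600 mph seen from 10 km away, versus trackside scenery 5 m away passing a car at 60 m/s):

```python
import math

def degrees_per_frame(speed_mps, distance_m, fps):
    """Angular movement of an object across the viewer's vision in one frame."""
    metres_per_frame = speed_mps / fps
    return math.degrees(math.atan2(metres_per_frame, distance_m))

# Jet: ~268 m/s at 10 km -> a tiny angular step per 30 fps frame.
print(round(degrees_per_frame(268, 10_000, 30), 3))   # ~0.051 degrees

# Trackside object 5 m away at 60 m/s -> a huge angular step per frame.
print(round(degrees_per_frame(60, 5, 30), 1))         # ~21.8 degrees
```

The jet covers almost 9 metres per frame yet barely moves across your vision, while the nearby trackside object jumps over 20 degrees per frame, which is why distant reference points mask per-frame increments while close ones expose them.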
 
No, you cannot. If you make a slight mistake, 60 fps will tell you that you made the mistake twice as fast, and you have more time to correct it.

I don't want to be arrogant, but are you in such a position, where you game so well, that such a tiny delay makes a difference to your ability to play? I can understand it affecting the very top pro gamers, but for 99% of people, is such a tiny improvement worth half the eye candy?
 
You can definitely feel the difference between 60 fps and 30 fps shooters, or between 60 fps and 30 fps racers, action games, fighting games etc.

I think the best compromise given next gen performance levels is for games to go for 30 fps with minimal response times, like NFS Hot Pursuit.

That will mean response times only 25% slower (83 vs 66 ms) than 60 fps titles, while still having twice the time to render each frame.
That represents the best trade-off.

How exactly did Criterion manage to do this? Is it just a matter of optimising each stage of their gameplay and rendering pipeline to get the minimum lag possible at 30 fps?
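One way the quoted 83 vs 66 ms figures can arise, as a rough sketch: treat input-to-display latency as a pipeline that is some number of frame-times deep. The pipeline depths below (2.5 frames at 30 fps, 4 frames at 60 fps) are my assumptions chosen to reproduce the figures, not anything Criterion has published.

```python
# Assumed model: total latency = pipeline depth (in frames) x frame time.
def latency_ms(fps, pipeline_frames):
    """Input-to-display latency when the pipeline is N frame-times deep."""
    return pipeline_frames * 1000.0 / fps

print(round(latency_ms(30, 2.5)))   # ~83 ms: a low-lag 30 fps pipeline
print(round(latency_ms(60, 4.0)))   # ~67 ms: a typical 60 fps pipeline
```

Under this model, a 30 fps game that trims its pipeline to 2.5 frames lands within about 25% of a 4-frame 60 fps game, which matches the trade-off described above.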
 