*spin* another 60 vs 30 framerate argument

So any game in which the speed at which you can respond to things happening on screen, and see the result of your response, affects how well you can play is going to benefit from a higher framerate.

Agree completely.

Why push for full HD rendering only to spoil it all with motion blur?

I'd take 720@60 over 1080@30 any day of the week.

Cheers
 
I disagree with the whole reaction and interception argument. 30 fps is enough to accurately predict where something is going to be. There are YouTube vids of gamers getting mid-air sniper shots and the like with incredible accuracy. The only time it's not enough is when a change happens in trajectory, and 60 fps gives an extra 17 ms of advance warning. Actually, I'll give 60 fps a bit more benefit, as it provides potentially 3 frames of motion where 30 fps looks static, providing ~50 ms of useful info. But unless you're a top-tier player, it's hardly a game changer.
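For anyone who wants the arithmetic behind those numbers, a quick back-of-the-envelope sketch (Python, purely illustrative):

```python
frame_30 = 1000 / 30   # ~33.3 ms between frames at 30 fps
frame_60 = 1000 / 60   # ~16.7 ms between frames at 60 fps

print(f"extra advance warning per frame: {frame_30 - frame_60:.1f} ms")  # ~16.7 ms
print(f"three extra 60 fps frames span:  {3 * frame_60:.1f} ms")         # ~50 ms
```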

You are completely wrong. The key aspect to understand about latency in competitive games is that all latency is additive. This includes latency of the controller, latency of the game, latency of the screen, latency of the network, and the largest source of latency of them all, the latency of the player's brain.

Think of a very simple game where two players have their finger on a button, and stare at the screen. Once the screen changes color (on order by a remote server), the one who presses the button first wins. Pressing too early is penalized. Now, as long as the players are roughly equal, it doesn't matter where the main source of latency is. Any latency in any part of the chain instantly and directly hurts the player. 1 ms of extra latency in screen drawing still hurts, even if the network latency is on the order of hundreds of ms and the processing latency in the brain ("skill") is even longer. Larger latencies cannot mask smaller latencies, because in the end they are all added together.

Now, most competitive games are not quite so simple, but almost all of them have a twitch element. Reducing latency does give an advantage at that twitch component, even if the reduction is a very small portion of the total latency. Going from 201ms to 200ms gives a measurable boost in ability.
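To make the "all latency is additive" point concrete, here is a throwaway simulation of that button game. Every number in it is invented; the only point it makes is that a one-frame difference still shifts the odds when everything else is equal.

```python
import random

def reaction_game(latencies_a_ms, latencies_b_ms, trials=10_000):
    """Toy model of the 'press when the screen changes colour' game above.
    Each press registers after the SUM of every latency in that player's
    chain (brain, network, display, frame time, ...), plus some jitter on
    the brain's reaction time. All numbers are invented."""
    wins_a = 0
    for _ in range(trials):
        total_a = sum(latencies_a_ms) + random.gauss(0, 30)
        total_b = sum(latencies_b_ms) + random.gauss(0, 30)
        if total_a < total_b:
            wins_a += 1
    return wins_a / trials

# Equally skilled players (200 ms brain, 100 ms network, 30 ms display);
# the only difference is one frame time: ~17 ms at 60 fps vs ~33 ms at 30 fps.
player_60 = [200, 100, 30, 1000 / 60]
player_30 = [200, 100, 30, 1000 / 30]
print(f"60 fps player wins {reaction_game(player_60, player_30):.0%} of duels")
```

Even with the huge (made-up) 30 ms of human jitter dominating everything, the player with the shorter chain wins noticeably more than half the time.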
 
If there were a 1:1 response to button presses delayed only by refresh, I doubt many could tell the difference between 1/30th of a second and 1/60th of a second when shooting a gun or jumping. It is there, and you'll notice it if you go looking, but in the midst of a game the difference is minimal. And as I say, online it's totally immaterial. You could have 1/1000th of a second controller latency and pull the trigger exactly on the enemy's head, but in reality they are a few feet away because the online lag isn't placing them in exactly the right spot on your display. ;) Thus I don't think discussions about 30 vs. 60 should focus on responsiveness, where framerate is only a small part these days. Framerate is an aesthetic.
 
You are completely wrong.
Yes, but you're talking top, top-end gamers worrying about that level of latency, and they have all sorts of other issues to sort out first before framerate changes are going to make a significant difference. 17 ms of extra frame delay on top of 100 ms of display latency is not going to make the difference. First things first: if that latency is troubling you, switch to a monitor or a gaming TV, and get a gaming ISP service. But targeting 60 fps in all games, when it'll make almost no difference in real terms to those playing, doesn't make a great deal of sense. COD is used as an example, but that 60 fps makes very little difference to most players on 100+ ms TVs and 150+ ms internet connections. 60 fps just feels better, and maybe gives them a sense of better sensitivity.

If I present a hypothetical example like yours, the two players are watching a square travel from the left of the screen and have to press the button when it rests in the centre of a cross-hair. At 30 fps, the player is still going to get enough movement cues (assuming the square isn't moving stupidly fast!) to predict when it'll be in the cross-hairs. With the IO polling 60 times a second, the 60 fps player won't have any advantage. But if the 60 fps player is on a TV with 100 ms lag and the connection is 100 ms lag, versus 50 + 50 for the 30 fps player, the lower framerate is going to provide the more accurate, responsive game.
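Taking the figures in that hypothetical at face value, the two chains sum roughly like this (a rough addition only, ignoring polling, pipelining and everything else):

```python
# Rough sums of the hypothetical latency chains above (numbers from the post).
frame_60 = 1000 / 60   # ~17 ms
frame_30 = 1000 / 30   # ~33 ms

player_60fps = frame_60 + 100 + 100   # 60 fps, laggy TV, laggy connection
player_30fps = frame_30 + 50 + 50     # 30 fps, fast TV, fast connection

print(f"60 fps chain: {player_60fps:.0f} ms")   # ~217 ms
print(f"30 fps chain: {player_30fps:.0f} ms")   # ~133 ms
```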
 
Shifty Geezer said:
Yes, but you're talking top, top-end gamers worrying about that level of latency, and they have all sorts of other issues to sort out first before framerate changes are going to make a significant difference. 17 ms of extra frame delay on top of 100 ms of display latency is not going to make the difference. First things first: if that latency is troubling you, switch to a monitor or a gaming TV, and get a gaming ISP service. But targeting 60 fps in all games, when it'll make almost no difference in real terms to those playing, doesn't make a great deal of sense. COD is used as an example, but that 60 fps makes very little difference to most players on 100+ ms TVs and 150+ ms internet connections. 60 fps just feels better, and maybe gives them a sense of better sensitivity.

If I present a hypothetical example like yours, the two players are watching a square travel from the left of the screen and have to press the button when it rests in the centre of a cross-hair. At 30 fps, the player is still going to get enough movement cues (assuming the square isn't moving stupidly fast!) to predict when it'll be in the cross-hairs. With the IO polling 60 times a second, the 60 fps player won't have any advantage. But if the 60 fps player is on a TV with 100 ms lag and the connection is 100 ms lag, versus 50 + 50 for the 30 fps player, the lower framerate is going to provide the more accurate, responsive game.

Two things. You don't understand gaming mechanics with respect to lag; the lag you talk about in online terms does not work at all the way you think. More on this later.

1. A higher frame rate gives you more visual cues, letting you, for example, adjust your aim faster or predict when that square of yours is hitting the middle. It's not, as you say, just aesthetics. If a shooter runs at 60 fps, you can aim more accurately (you get visual cues about your aiming twice as fast).

You cannot compare it to game display lag etc.; it's not the same. If I play a shooter at 60 fps, my brain can adjust and react to information twice as fast, meaning if I see a guy go left and I'm aiming slightly off centre, I can adjust faster. If I aim too much to the left I can readjust that much faster, because the information is coming in at twice the rate.

This is regardless of input lag etc., as you're always reacting to what you see!

Furthermore, most decent gaming TVs have 16-30 ms of lag or less, meaning that at MOST you will have one frame of lag on the input you see on screen. If the game runs at 30 fps, it takes longer for you to see how your input affected the game and to adjust accordingly. This holds whether you're online or not, and the higher framerate increases precision tremendously. It's not aesthetics.

In fact, it matters so much that if two gamers have the exact same skill and the same lag, but one runs at 30 fps and the other at 60, the 60 fps guy would win every single time. Why? Because the 60 fps player will see the other guy up to 16 ms sooner than the 30 fps guy and start his reaction first.
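A minimal sketch of that point, assuming everything else in the chain is identical: an event can only show up on the next frame boundary, so the 30 fps player waits roughly 8 ms longer on average to see it, and up to a full extra frame (~17 ms) in the worst case.

```python
import random

def average_extra_wait_ms(fps, samples=100_000):
    """An event happens at a random moment, but the screen can only show it
    on the next frame boundary; the extra wait averages about half a frame."""
    frame = 1000 / fps
    return sum(random.uniform(0, frame) for _ in range(samples)) / samples

print(f"30 fps: ~{average_extra_wait_ms(30):.1f} ms before the event can even appear")
print(f"60 fps: ~{average_extra_wait_ms(60):.1f} ms before the event can even appear")
```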

2. Furthermore, most online games today work in a manner where your aiming and movement are not really bound by Internet lag in the way you think.

For example, in COD, your movement, aiming and shooting are local to a large extent. Even though the server gets your movements and aiming at an x ms delay, your on-screen movement is instant. The server will only adjust you if your Internet is incredibly slow, if you do a move it deems illegal, or if the signal breaks.

What this means is that in COD, you always react to what is on YOUR screen at any given time. A simple example: assume I have 100 ms lag to the server, and disregard any input lag for the purpose of this example.

If you're falling down a cliff and I shoot you in the head mid-air, and MY LOCAL CONSOLE registers a hit, it will simply tell the server I hit you in the head. Now, if you're falling fast enough, then based on your thinking the server, which has zero latency of its own, would say the shot missed, right? Because obviously by the time 100 ms have passed you would have fallen much further and I would be shooting at air? This is not true. Since the local client says I hit, unless the hit is deemed impossible by the server (some rule about movement etc.), it will comply and kill you.

However, if we are both shooting at each other, the rule of the game is: whoever tells the server first that the other guy was shot in the head wins.

This is why, if you watch a killcam, it will always look slightly different to what you just experienced. The player that killed you, if he was moving, was in a slightly different position, and even though you shot 5 rounds, the killcam only shows 3 (the 3 that reached the server before it got the signal that you were dead!).
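Here's a toy sketch of that "my client reports the hit, the server sanity-checks it, and the first report wins" flow. To be clear, this is not COD's actual netcode; the class, the rules and the numbers are invented purely for illustration.

```python
# Toy illustration of client-reported hits with a server sanity check.
import heapq

class ToyServer:
    def __init__(self):
        self.events = []        # (arrival_time_ms, shooter, victim)
        self.dead = set()

    def report_hit(self, sent_at_ms, one_way_lag_ms, shooter, victim):
        heapq.heappush(self.events, (sent_at_ms + one_way_lag_ms, shooter, victim))

    def resolve(self):
        while self.events:
            arrival, shooter, victim = heapq.heappop(self.events)
            if shooter in self.dead:
                continue                      # report arrived after the shooter was already killed
            if self._plausible(shooter, victim):
                self.dead.add(victim)
                print(f"{arrival:.0f} ms: server accepts {shooter}'s hit on {victim}")

    def _plausible(self, shooter, victim):
        return True   # stand-in for the server's "was that move even possible?" rule

# Both players fire "at the same time" on their own screens, but A's report
# arrives first because A has 50 ms one-way lag and B has 100 ms.
server = ToyServer()
server.report_hit(sent_at_ms=0, one_way_lag_ms=50,  shooter="A", victim="B")
server.report_hit(sent_at_ms=0, one_way_lag_ms=100, shooter="B", victim="A")
server.resolve()
```

In this made-up run, A's report arrives first, so the server accepts it and discards B's later report.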

As such, framerate matters because of point 1, and it matters much more than you think because online lag doesn't work the way you thought.

If you don't believe me, run COD online (or 99% of other FPS games): if your TV has less than 16/32 ms of input lag, you will see your movement on YOUR screen the very next frame, even if you have 50 ms ping to the server. Or just turn off your wifi mid-game. Can you still move? Within some bounded-distance rule, you can!

Similarly, try playing an MMO and turning off the Internet. You can still run around for maybe 10 seconds before you actually disconnect in game. This was a common PvP trick back in the day in MMOs, as people would simply have BitTorrent running and turn off its upload limit to create a lag spike. Until the next signal to the server, to every other player involved you would appear to keep running in whatever direction you were already heading, then warp to a completely different position.
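And a similarly invented sketch of the "you can keep moving locally within some bound until the server catches up or drops you" behaviour described above (names, limits and numbers are all made up, not any real game's rules):

```python
# Toy client-side movement prediction with a "bound distance" rule and a timeout.
MAX_DRIFT_UNITS = 10.0       # how far the server lets you get from its last ack
TIMEOUT_S = 10.0             # how long you can go silent before being dropped

def simulate_link_loss(speed_units_per_s=2.0, dt=0.5, link_up=False, max_t=60.0):
    local_pos = last_acked_pos = silent_for = t = 0.0
    while t < max_t:
        t += dt
        local_pos += speed_units_per_s * dt          # you keep moving on YOUR screen
        if link_up:
            last_acked_pos, silent_for = local_pos, 0.0
        else:
            silent_for += dt
        if abs(local_pos - last_acked_pos) > MAX_DRIFT_UNITS:
            return t, "server snaps you back (drift limit hit)"
        if silent_for > TIMEOUT_S:
            return t, "disconnected"
    return t, "still connected"

print(simulate_link_loss())   # here the drift limit trips (~5.5 s) before the 10 s timeout
```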
 
The Knight from Ne & their assorted Shrubbery.

The answer is 24, 48, 72: video sync, and the threshold is whatever the game can maintain from there.

{Is it strange that the HDTV revolution is still based on 60 Hz interlaced fields? Why argue, when most of the time it is a rendering limit and not a stylistic decision to reduce the frame rate?}
 
{Is it strange that the HDTV revolution is still based on 60 Hz interlaced fields? Why argue, when most of the time it is a rendering limit and not a stylistic decision to reduce the frame rate?}

It is a stylistic decision of what to prioritize.
 
Tuna,
Thank you for the polite response. I didn't realize how the post would read. You are right: the designers negotiate with the limit, or with which features they place priority on. ~ By stylistic... I was watching Total Recall and noticed how 24 frames versus 60 frames in football reveals a world of difference in the action. 24 fps is a strobe effect most people don't even notice unless they go clubbing. So I realized that 24 fps causes the mind to fill in the gaps, and makes actors look much better in action films when they aren't very good at action. ~ That's what I meant by a style decision.

If your art or animation is crap, then a lower frame rate would make crap animation look better.
With the glut of detail people feel they need to put into games, I get what you mean by priority.
But Minecraft is proof you can do crap and make it enjoyable, mining the gamer's mind to imagine.
 
Based on the thread, 30 fps is a stylistic and a bug-reducing limit (to prevent slowdown). All games should run at 60 fps, but devs would rather go for a prettier game than a game that runs fast, in the hopes that more people will buy it and their company won't get shut down because of poor sales.

Of course, if you don't know any better than 30 fps, then the points don't matter either way, since that's the bare minimum.
 
Chez already made a clear(er) point. I'd just like to tell Shifty to try a fast racing game on a PC at both 30 and then 60 fps. From my experience, 30 fps is not accurate enough, especially when objects move/interact at high speeds.

(My limited knowledge can't explain it in clear detail, so anyone more technically inclined, please correct my basic understanding on this.)

At 30 fps you should find yourself hitting corners or missing perfect trajectories way more often, just by a small edge at high/peak in-game speed, and this isn't just due to increased latency in rendering. If I'm not mistaken, because of the increased per-frame movement of geometry within the rendering process at 30 fps, the distance gap of said geometry as it moves within the world is doubled (it moves the same, but you see half as many samples of it, so each visible step covers twice the distance) compared to 60. Meaning at 30 fps there's not just less time to react, but also less distance. As an ASCII-ish rough example, take object A moving along distance 0 to 6 at *insert high speed*: rendered at 30 fps we would only see distances 0, 3, 6, while moving the same distance rendered at 60 fps we would see the more precise 0, 1, 2, 3, 4, 5, 6. At 30 fps, if you had to brake at distance 2, we would be at the whim of whether the physics engine was coded to truncate or round the value, so you would either brake in time if it truncated the value or, if it rounded, brake late and crash/miss/whatever.
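That rough example, written out as a tiny sampling sketch (arbitrary units and speed; with evenly spaced steps the 30 fps case shows 0, 2, 4, 6 rather than 0, 3, 6, but the point, half as many visible positions over the same distance, is the same):

```python
# How many positions of a fast-moving object you actually get to see per frame rate.
def visible_positions(speed_units_per_s, fps, duration_s):
    dt = 1.0 / fps
    steps = int(duration_s * fps)
    return [round(speed_units_per_s * dt * i, 2) for i in range(steps + 1)]

print("30 fps:", visible_positions(speed_units_per_s=60, fps=30, duration_s=0.1))
print("60 fps:", visible_positions(speed_units_per_s=60, fps=60, duration_s=0.1))
# 30 fps: [0.0, 2.0, 4.0, 6.0]
# 60 fps: [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
```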

Anyway, I think there's not much to discuss (again) about targeting 30 or 60 fps. It's up to the developers to trade off what's best for their game type and its pace. IMO 30 fps is perfectly acceptable if it's either a slow "hide behind a wall" shooter or a truck driving simulator, but aiming for 30 fps on, say, an F1 racing game is a very bad decision.
 
^ hahaha, so you are willing to sacrifice game performance so you can send it over the internet? lol. Movies! They will justify anything for their cause.
 
^ hahaha, so you are willing to sacrifice game performance so you can send it over the internet? lol. Movies! They will justify anything for their cause.

No, I want a stable frame rate.
If they can make a consistent 60 fps, I'll take that.
If they can only make a consistent 30 fps, I'll take that instead.

Otherwise, I think we're looking at a standard of 1080p/30fps... which, as primarily a PC gamer, I don't find great - but if it's stable, it's playable.

But technically, I think 30fps might suit Sony/MS if they intend to do streaming to/from the internet/other devices.
 
No, I want a stable frame rate.
If they can make a consistent 60 fps, I'll take that.
If they can only make a consistent 30 fps, I'll take that instead.

Otherwise, I think we're looking at a standard of 1080p/30fps... which, as primarily a PC gamer, I don't find great - but if it's stable, it's playable.

But technically, I think 30fps might suit Sony/MS if they intend to do streaming to/from the internet/other devices.

You say streaming and stable in the same sentence? Good luck with that!
 
Some thoughts:

Comparing Ratchet & Clank: Full Frontal Assault/QForce (QF) vs A Crack In Time (ACIT), everything looks much blurrier in QF when things are moving. This is particularly noticeable when you move the camera around the player. This blurriness makes it much more difficult to see and focus on what is happening on the screen, especially when things are moving.

I feel that the same thing occurs in the new GOW game as well. Compared to PS3 GOW1, it is much more difficult to see the enemy's attacks.
 
Chez already made a clear(er) point. I'd just like to tell Shifty to try a fast racing game on a PC at both 30 and then 60 fps. From my experience, 30 fps is not accurate enough, especially when objects move/interact at high speeds.

(My limited knowledge can't explain it in clear detail, so anyone more technically inclined, please correct my basic understanding on this.)

At 30 fps you should find yourself hitting corners or missing perfect trajectories way more often, just by a small edge at high/peak in-game speed, and this isn't just due to increased latency in rendering. If I'm not mistaken, because of the increased per-frame movement of geometry within the rendering process at 30 fps, the distance gap of said geometry as it moves within the world is doubled (it moves the same, but you see half as many samples of it, so each visible step covers twice the distance) compared to 60. Meaning at 30 fps there's not just less time to react, but also less distance. As an ASCII-ish rough example, take object A moving along distance 0 to 6 at *insert high speed*: rendered at 30 fps we would only see distances 0, 3, 6, while moving the same distance rendered at 60 fps we would see the more precise 0, 1, 2, 3, 4, 5, 6. At 30 fps, if you had to brake at distance 2, we would be at the whim of whether the physics engine was coded to truncate or round the value, so you would either brake in time if it truncated the value or, if it rounded, brake late and crash/miss/whatever.

Anyway, I think there's not much to discuss (again) about targeting 30 or 60 fps. It's up to the developers to trade off what's best for their game type and its pace. IMO 30 fps is perfectly acceptable if it's either a slow "hide behind a wall" shooter or a truck driving simulator, but aiming for 30 fps on, say, an F1 racing game is a very bad decision.

Wow. You must have a poor opinion of a human's ability to track movement and predict their own position in space. 60 fps isn't going to help you take corners "way" better than 30 fps. Your brain is well ahead of anything being drawn on screen, as it's constantly predicting object movements and determining their future positions. The frames provide visual feedback, and the more the merrier (in terms of error correction), but the frame rate threshold needed to go from being a bad driver to a good driver doesn't sit between 30 and 60 fps.

Your brain can't perceive individual frames at anything higher than 10-15 fps, so at 30 fps your brain isn't interpreting the distance travelled between 0 and 6 as 0, 3, 6. It's perceiving a smooth transition between 0 and 6, with higher frame rates providing a smoother transition.

And if the difference between making a turn and crashing is whether you brake at distance 2 (success) or distance 3 (failure), then you're probably braking way too late anyway, because I highly doubt that racing video games routinely create situations where your margin for error is a 16 ms window. That's probably on the driver, not the game or its framerate.
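For a sense of scale on that 16 ms window, here's the trivial arithmetic for how far a car travels during one frame at a few arbitrarily chosen speeds:

```python
# Distance covered per frame at 30 fps vs 60 fps (speeds picked for illustration).
for kmh in (100, 200, 300):
    m_per_s = kmh / 3.6
    print(f"{kmh} km/h: {m_per_s / 30:.2f} m per 30 fps frame, "
          f"{m_per_s / 60:.2f} m per 60 fps frame")
```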
 
I wonder how developers will test next-gen consoles on resolution and frame rate?

What I mean is do they start on the low-end with the PS3/360 standards and then work their way up to what a PC with moderate specs can do, or is it vice-versa?

It seems what most console developers do at the start of every console generation is to make games to the previous consoles' standards (same models/physics/textures, but with better resolution and frame rate), and then once they move past those previous standards they end up sacrificing resolution and frame rate to attain better results in those other areas.

What I'm trying to say is: are there any developers who start on new consoles with an actual idea of what they'll be aiming for later in that console generation, without having to sacrifice much of anything later on to do so?
 