Why go for 60 FPS in every bit of every game?

Higher FPS reduces blur and greatly enhances picture clarity.
It literally makes every game better, to varying degrees.
Also you must hate VR, OP.

60Hz should be the industry standard, and even that isn't enough. I want 120Hz for every game, but I know that won't happen until it becomes the standard refresh rate for TVs.
 
I used to play Quake3 competitively, for not particularly good reasons. For more than a decade after I quit, I kept a high-end Sony CRT around in order to play at 125fps or higher, in conjunction with high-rate mouse polling. Playing on a 60Hz LCD was just...bad. Really bad.
Nowadays I don't play games much, and I'm more likely to go VR than skill-based gaming in the future, but a low framerate is still unacceptable to me in any game where I can control the "camera".

Different games have different demands. Slow camera movement, timer-based (or turn-based :)) gameplay and so on all reduce the importance of low-latency input and graphical feedback. Movies and 30fps console games work because they are made to work within the limitations of the medium. Also, people's experiences shape their expectations - if you haven't adapted to low-latency gaming, or even experienced it, you are obviously less likely to miss it. Which is the case for all TV-bound gamers, for instance.

One of the central benefits of gaming on a PC, for me, is that I can always fiddle with settings to hit my performance targets.
 
Fast sideways-moving objects and/or a sideways-moving camera are problematic at 24/30 fps. You can clearly see the judder when you follow a fast sideways-moving object (or the scenery) with your eyes. Heavy motion blur helps a bit, but doesn't solve the problem. Movie directors know this and plan their shots accordingly. In games we don't have similar control over the movement of the camera and the important objects. Judder causes headaches. You might not notice it right away, but after spending hours straight playing a 30 fps game, many of us feel the difference. People are different in this regard; I am personally very sensitive to this.
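To put rough numbers on the judder (my own illustrative figures, not from the post above): when your eyes smoothly track a moving object on a sample-and-hold display, the object jumps by roughly speed/fps pixels between refreshes instead of gliding across the retina. A tiny sketch:

```python
# Illustrative arithmetic for judder when eye-tracking a moving object.
# On a sample-and-hold display the tracked object jumps by (speed / fps)
# pixels between consecutive frames instead of moving smoothly.

def per_frame_jump(speed_px_per_s: float, fps: float) -> float:
    """Pixels the object jumps between consecutive frames."""
    return speed_px_per_s / fps

# Example: an object panning across a 1920-px screen in 2 seconds (960 px/s).
speed = 1920 / 2.0
for fps in (24, 30, 60, 120):
    print(f"{fps:>3} fps: {per_frame_jump(speed, fps):5.1f} px jump per frame")
```

At 30 fps that example is a 32-pixel jump every frame, which the eye can easily resolve while tracking; at 120 fps it drops to 8 pixels.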

This problem is even more pronounced in stereo 3d (VR or not). When I first saw the Avatar movie, I immediately noticed that 24 fps was not good enough for stereo 3d. There were several scenes with fast sideways movement. The judder was unbearable. It was impossible to focus your eyes on the flying creatures at all. I am glad that VR makers understood this: the standard frame rate became 90 fps (+ PSVR does 120 Hz timewarp).

There are many game genres where 30 fps just doesn't work at all. All 2.5d games (side scrollers) benefit heavily from 60 fps as the movement is mostly sideways. This includes platformers and fighting games. Just imagine playing Street Fighter or Tekken at 30 fps :(

I don't understand the point of 30 fps single player campaign vs 60 fps multiplayer in first person shooters. Why would I prefer worse aim precision and worse reaction time in single player mode? PC gamers are buying 120/144 Hz monitors solely to play first person shooters. It certainly makes the game feel better. This proves that even 60 fps is not enough for this genre. I understand 30 fps for slow paced 3rd person games such as Assassin's Creed or Uncharted, but 30 fps is just a bad trade-off for first person shooters.

2x render time (60 fps -> 30 fps) is not going to bring incredibly better visuals. Next-gen consoles tend to bring 8x-10x higher performance, so 2x is only a fraction of a generational jump. 30 fps is not going to get you anywhere close to next-gen visuals. Halving the frame rate hurts the gameplay much more than the slight improvement in visuals helps.
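The frame-budget arithmetic behind that argument can be sketched as follows (the 8x-10x generational gain and the log-scale framing are my own assumptions for illustration, not figures from the post):

```python
# Per-frame GPU budgets and how a 2x budget compares to a generational jump.
import math

budget_60 = 1000.0 / 60   # ~16.7 ms per frame at 60 fps
budget_30 = 1000.0 / 30   # ~33.3 ms per frame at 30 fps
print(f"60 fps budget: {budget_60:.1f} ms, 30 fps budget: {budget_30:.1f} ms")

# If a console generation brings ~8x-10x the performance, then on a log
# scale a 2x budget increase is log2(2)/log2(gain) of a full generation.
for gen_gain in (8, 10):
    frac = math.log2(2) / math.log2(gen_gain)
    print(f"{gen_gain}x generation: 2x budget = {frac:.0%} of a generation")
```

So dropping to 30 fps buys roughly a third of a generation's worth of extra per-frame compute, under these assumed numbers.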

It is also worth noting that the recent advances in temporal reprojection techniques favor 60 fps over 30 fps. A higher frame rate makes temporal reprojection work better, as the difference between two frames is smaller. Reprojected data is more correct, resulting in fewer visual glitches and higher quality. This will result in better quality antialiasing, reflections, AO, etc for the same cost. Or alternatively, you can reach the same quality at reduced cost. As these techniques (and other temporal reuse techniques) advance further, the cost of 60 fps rendering draws closer and closer to that of 30 fps.
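As a toy illustration of why reprojection error shrinks with frame rate (entirely my own construction, not any engine's actual code): reuse the previous frame of a scrolling pattern with a slightly wrong motion vector, and measure how far the result is from the true current frame.

```python
# Toy 1D temporal reprojection: sample the previous frame along an
# (imperfect) motion vector and compare against ground truth. The error
# scales with how far the scene moved between frames, i.e. with 1/fps.
import math

def shade(x: float, t: float) -> float:
    """Ground-truth 'shading' of a pattern scrolling at 100 units/s."""
    return math.sin(0.1 * (x - 100.0 * t))

def reprojection_error(fps: float, motion_estimate_bias: float = 0.9) -> float:
    """Mean error of reusing the previous frame with a motion vector that
    is only 90% of the true motion, averaged over a strip of pixels."""
    dt = 1.0 / fps
    est_offset = motion_estimate_bias * 100.0 * dt  # imperfect motion vector
    xs = [float(i) for i in range(0, 200, 5)]
    err = 0.0
    for x in xs:
        reprojected = shade(x - est_offset, 0.0)  # resample previous frame
        current = shade(x, dt)                    # what we actually want
        err += abs(reprojected - current)
    return err / len(xs)

print(f"30 fps error: {reprojection_error(30):.4f}")
print(f"60 fps error: {reprojection_error(60):.4f}")  # smaller: less motion per frame
```

With a perfect motion vector (bias = 1.0) the error is zero regardless of fps; the point is that any fixed relative error in the motion estimate costs twice as much at 30 fps as at 60 fps.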
 
> I don't understand the point of 30 fps single player campaign vs 60 fps multiplayer in first person shooters. Why would I prefer worse aim precision and worse reaction time in single player mode? [...]

Even 3rd person games where the user controls the camera can be anywhere from annoying to unbearable at 30 FPS. It may not be as much of a problem for console players, where camera motion is more limited, but on a PC where you can control the camera with a mouse, moving the camera back and forth to track enemies, scan the environment, etc. produces lots of unpleasant visual artifacts (judder) from 30 FPS rendering, not to mention the increased input lag that makes you feel like you're moving the camera through pudding.

But I guess as some people have mentioned, if all you are used to is most games being 30 FPS or lower, then that is what feels normal because you expect those rendering anomalies to exist.

It's similar to how many people still think 24 FPS film or 30 FPS video is superior to 48/60 FPS film or video when it just isn't. It's just that people expect their movies and shows to have judder. And if it isn't there, then they think something is wrong.

Also similar to how many people felt aliased rendering in games was just as good as anti-aliased rendering in games, until they got exposed to a greater prevalence of decent AA in games. Or AF or any number of things that improved the overall experience compared to what they were used to.

Regards,
SB
 
> Fast sideways moving objects and/or sideways moving camera are problematic for 24/30 fps. You can clearly see the judder when you follow a fast sideways moving object (or scenery) with your eyes. [...]

This kind of illustrates an example where the FPS tradeoff made sense - the battles in the PS1 Final Fantasy games. They were at an abysmally low 15 FPS, but because they included carefully managed and indirect content they didn't look too bad. At least not to me. I think it was worth the increase in stuff they could draw on the screen, particularly with the fancier animations which had a lot of blended primitives.
 
This
[animated gif: fpsdemo1.gif]
 
[animated gif: SS44.gif]


You forgot the motion blur.

And that's why I almost always turn off artificial motion blur in games: it looks nothing like the natural blurring of objects in motion in the real world. I.e., if I track the moving object with my eyes, there should be no motion blur on that object. That isn't possible in games, as the game has no way of knowing what you as the player are looking at, so it all just ends up looking very, VERY wrong. And in my case it actually hurts my eyes, which keep trying to focus on something they should be able to focus on, but which motion blur in games makes impossible to focus on.

That 60FPS text should be perfectly clear when tracking it with your eyes, but due to the nature of artificial motion blur, it isn't. So my eyes end up hurting: they keep straining to bring into focus something that my eyes/brain expect to come into focus but never does.

Regards,
SB
 
why the hell was the 60fps one the most blurred one? You did it wrong.

Because it was just a joke.
I took his gif, decompiled it, replaced the original frames with 1+2, 2+3, 3+4 averaged frames, and slapped it back together into a gif. It took a couple of minutes.
Should I have spent much longer trying to create a modern style implementation of motion blur for that graphic?
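The pairwise frame averaging described above is simple enough to sketch. Here frames are modeled as plain lists of grayscale pixel values (a real gif round-trip would need an image library such as Pillow, which this sketch deliberately avoids):

```python
# Fake motion blur by replacing frames [f1, f2, f3, ...] with the
# pairwise averages [(f1+f2)/2, (f2+f3)/2, ...], as described above.

def average_adjacent(frames):
    """Return averaged adjacent-frame pairs; output has one fewer frame."""
    blurred = []
    for a, b in zip(frames, frames[1:]):
        blurred.append([(pa + pb) / 2 for pa, pb in zip(a, b)])
    return blurred

# A bright pixel stepping one position per frame:
frames = [[255, 0, 0], [0, 255, 0], [0, 0, 255]]
print(average_adjacent(frames))
# the averaged frames smear the pixel across two positions, like a blur trail
```

Note this smears each frame across a full frame interval (a "360-degree shutter"), which is exactly why the result looks more blurred than real camera motion blur would.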
 
It's just that it makes it seem that faster framerates increase MB, when in reality the opposite is true. It also makes it seem that even at 60fps moderate motion would lead to long blurs, when in reality the 60fps version would have been close to unblurred, or not blurred at all, at its (near one pixel/frame) speed.
 
[animated gif: SgZL.gif]


Better?

But even that shouldn't be viewed as an accurate comparison of motion blur at different framerates. Just like the original gif wasn't actually 60hz, 30hz, and 15hz. Animated gifs can't even display at 60 frames per second. And that's just one gif which is moving at a constant fps (I didn't check his original delay time). Also, not all browsers can display animated gifs at their fastest speed, so it's not a certainty that everyone is seeing the same thing. This new one I made is with the minimum frame delay, so its fastest display speed would be 50fps. Again, though, all three "words" are being displayed at that same framerate. I'll let someone else make three separate gifs with different frame delays and the corresponding trailing bleed that goes along with them.
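For reference on that 50fps figure: GIF frame delays are stored as integers in hundredths of a second, so only rates of 100/n fps are expressible, and a delay of 2 centiseconds tops out at 50 fps. (Many viewers also clamp smaller delays upward, which is a commonly observed viewer quirk rather than part of the GIF format itself.) A quick sketch:

```python
# GIF delays are integers in centiseconds, so only 100/n fps is expressible;
# a true 60 fps (16.7 ms per frame) cannot be encoded.

def gif_fps(delay_centiseconds: int) -> float:
    """Playback rate implied by a GIF frame delay (hundredths of a second)."""
    return 100.0 / delay_centiseconds

for d in (2, 3, 4, 6):
    print(f"delay={d} -> {gif_fps(d):.1f} fps")  # delay=2 is the 50 fps cap
```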
 
> This kind of illustrates an example where the FPS tradeoff made sense - the battles in the PS1 Final Fantasy games. They were at an abysmally low 15 FPS, but because they included carefully managed and indirect content they didn't look too bad. [...]
Worth every millisecond.

[animated gif: d82.gif]
 
Whatever, I get the point. You just made it for shits and giggles. I just thought some clarification was welcome, since we have enough MB haters already, and quite a few ill-informed ones every once in a while, who could get the wrong idea from this.
 

Games built around 30Hz won't scale to 60Hz due to CPU loads, etc, and we don't get 60Hz this round in the general case. Instead games will generally offer some amount of extra super-sampling for 1080p on upgrade consoles.

Hence a significant CPU boost would help with realizing the dream of 60 Hz for more titles. So PS4 Pro won't help with that, but Project Scorpio might. Even with that, however, I expect some 30 Hz titles on PS4 to opt for 60 Hz at 1080p on PS4 Pro rather than attempting 4K. This is assuming that they weren't already CPU limited to some extent in many cases; Fallout 4, for example, had to reduce some graphical effects on PS4 due to CPU limitations.

I expect that Sony will push their internal studios to attempt 4K, however, as Lottes suggests. It makes sense as Sony would like to boost sales of UHD TVs which has the potential to increase revenue generation for another division of Sony.

Regards,
SB
 
> Hence a significant CPU boost would help with realizing the dream of 60 Hz for more titles. So PS4 Pro won't help with that, but Project Scorpio might. [...]

Regards,
SB

I would rather 30hz titles on the PS4 go for better graphics than 60FPS.

Most games do not need 60FPS. Drive Club even proves that racing games don't require 60FPS. It may help with some racing games, but it is not a necessity.

For example, I'd like to see "Beyond Two Souls" by Quantic Dream remastered for the PS4 Pro at 1080P and 30FPS. With all that extra power, they could make the game look even better. Or a good example would be the Crysis games on PS4. They could remaster the games so they could have graphics that would at least equal the very high setting on PC.

If I had a PS4 Pro, I'd avoid any game that didn't allow me a 1080P and 30FPS option that had better graphics and not just downsampled from a higher resolution.
 
Then there are the movies for you.
Unless you care how a game plays and feels, you don't need 60 fps and beyond.
If you only care how a game looks, movies are a better entertainment option. You lose the interactivity, but for 24 fps and hyper-realistic graphics, I'm sure that's a fair trade.
 