30fps is the 480p of frame rates. It's the absolute bare minimum, and once you go below it you start having serious problems. 30fps games need a ton of post-processing motion blur to hide judder when the camera moves quickly. So you have a nice 4K presentation and then smear the whole thing every time the camera moves. At that point you don't need to worry about your display's MPRT, because you've intentionally ruined the entire image anyway. Motion blur can look quite nice at high frame rates, because at 120fps it can be very subtle and still hide judder. In-game motion blur or not, 30fps games just do not look good in motion. They look great when the camera is still or moving very slowly; the faster the camera moves, the more the image breaks down. That's just how your eyes respond to sample-and-hold displays. You can't even use blur reduction effectively at 30fps. You need to get above 100fps before blur reduction on displays with BFI or strobing starts working well, and those techniques have their own issues.
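If you want a rough sense of how much a sample-and-hold display smears tracked motion, here's a quick back-of-the-envelope sketch (the panning speed is just an example value I picked, not something from a specific game). It assumes persistence equals the full frame time, which is the no-BFI, no-strobing case:

```python
# Rough sketch (assumption, not a measurement): perceived blur width when your
# eye tracks motion on a sample-and-hold display with no BFI/strobing.
# With persistence equal to the full frame time, blur ~= panning speed / fps.

def blur_width_px(panning_speed_px_per_s: float, fps: float) -> float:
    """Approximate eye-tracking blur width in pixels."""
    return panning_speed_px_per_s / fps

speed = 960  # px/s, an arbitrary example panning speed
for fps in (30, 60, 120, 240):
    print(f"{fps:>3} fps: ~{blur_width_px(speed, fps):.0f} px of smear at {speed} px/s")
```

At that speed you'd get roughly 32 px of smear at 30fps vs 8 px at 120fps, which is why the in-game blur has to do so much more work at low frame rates.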
A lot of gamers today have never gamed on a CRT, or never owned a genuinely fast display with g2g times that can handle 120Hz or higher. Their impression of 60fps comes from a VA panel or an older TV with slow g2g that blurs everything with ghosting anyway. People are also used to gaming on displays with maybe 100ms of display lag, so adding one frame of input delay in the game engine puts you at roughly 133ms at 30fps vs 117ms at 60fps. That's hard to tell apart. But when someone with a monitor that has under 10ms of display lag at 120Hz+ switches over to a 30fps game on a 60Hz signal, they've gone from maybe 18ms of total lag to around 50ms, and that change feels really, really bad.
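The arithmetic behind those numbers is just one engine frame plus the display's own processing delay. A minimal sketch, where the display-lag figures are the illustrative guesses from above rather than measurements:

```python
# Minimal sketch of the lag arithmetic: one frame of game-engine delay plus
# the display's processing lag. Display-lag values are illustrative guesses.

def total_lag_ms(fps: float, display_lag_ms: float, engine_frames: int = 1) -> float:
    return engine_frames * (1000.0 / fps) + display_lag_ms

print(total_lag_ms(30, 100))   # ~133 ms: 30fps on a slow TV
print(total_lag_ms(60, 100))   # ~117 ms: 60fps on the same TV
print(total_lag_ms(120, 10))   # ~18 ms:  120fps on a fast gaming monitor
print(total_lag_ms(30, 17))    # ~50 ms:  30fps over a 60Hz signal on that monitor
```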
I have my console hooked up to a gaming monitor with really fast transition times, and judder is very apparent even at 60fps. Even 60fps doesn't look great without motion blur; 30fps is just terrible. I guess it's a case of people not knowing what they're missing. Ignorance is bliss. Enjoy 30fps.
Edit: Just going to leave this here https://www.testufo.com/
Try playing with the speed. Even at 120 pixels per second, 60fps is noticeably sharper than 30fps. At 240 pixels per second, 60fps is much better. And that's a fairly slow panning speed.
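Same formula as before if you want to put numbers on it: the jump between frames is just the panning speed divided by the frame rate. A tiny sketch with the speeds mentioned above:

```python
# Tiny sketch: how far the test UFO moves per frame at the speeds above.
# Bigger per-frame jumps mean more visible judder/smear as your eye tracks it.

def step_px(panning_speed_px_per_s: float, fps: float) -> float:
    return panning_speed_px_per_s / fps

for speed in (120, 240):
    for fps in (30, 60):
        print(f"{speed} px/s @ {fps} fps: {step_px(speed, fps):.0f} px per frame")
```

So even at a slow 240 px/s pan, 30fps is jumping 8 px every frame while 60fps only jumps 4 px.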