Poll: What's your minimum framerate?

When choosing graphics settings, what's your minimum framerate to aim for?

  • Above 60 fps. Anything less than 100 fps is a juddery mess


Shifty Geezer

The results from the 'what settings do you use' poll contradict the common story about PC gamers preferring framerate more than console gamers do. As a partner poll, this one asks specifically about framerate choice, to see whether PC gamers (on B3D ;)) are actually happy with 30 fps or whether 'prioritise quality' actually assumes a higher framerate baseline.
 
My primary monitor has a 60Hz refresh, so there is little use in shooting for better than a solid 60Hz. On the other hand, that was the reason I gamed on a high-end Sony CRT until 2014. Now I also have a 144Hz Freesync LCD monitor, and of course when I use that, it is because I specifically want the higher frame rate. In all honesty though, I rarely play those games. The image quality of the 144Hz monitor being so much worse than the 60Hz one contributes to my settling for the lower frame rate.
 
no "varies, depending on the game"?

Twitch shooters and side scrollers (or isometric scrollers) = 60fps minimum.
Not-so-twitchy shooters and the majority of games = 30fps minimum.

This is because I can still tolerate lag, but I utterly can't unsee the slideshow of 30fps when the camera pans across a relatively flat plane :(
 
It's a very personal question. Some games are totally fine with 30fps, while some games that try to reach 60fps (but don't) would look better with a rock-solid 30fps. In my eyes, at least, and with no VRR tech.
 
no "varies, depending on the game"?
That was answered in the other poll. What I want to correlate is what 'prefer quality over framerate' from the other thread means. When asking this of console gamers, the choice is 30 fps or 60 fps, so 'prefer quality' means 30 fps. But for PC, I don't know whether 'prefer quality over framerate' means 30 fps or 60 fps; it could be that 60 fps is considered 'low framerate' and the quality settings for those who 'prefer quality' actually assume a minimum of 60 fps instead of being willing to go as low as 30.
 
I've got G-Sync, so I aim for consistency in graphics settings. Even when I hit 144 and drop down to 80, I'll feel it; I can't see it, but you feel it momentarily. Large fluctuations can be felt even when a high frame rate drops to a still-high frame rate. So I'll find ways to keep my frame-rate distribution tighter.

On PC it's vsync or G-Sync for me though; it's hard to go back to unlocked framerates going all over the place.
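For what it's worth, that 'tighter distribution' goal can be worked out from frame-time logs. A minimal sketch, assuming a plain list of frame times in milliseconds; the 5th-percentile heuristic is an illustration, not any particular tool's behaviour:

```python
# Hypothetical sketch: pick an fps cap from logged frame times so the
# frame-time distribution stays tight instead of swinging 144 -> 80.
# The percentile choice and the log format are assumptions.

def suggest_cap(frame_times_ms, percentile=5):
    """Return an fps cap at roughly the 5th-percentile framerate,
    i.e. a rate the system can hold ~95% of the time."""
    fps_samples = sorted(1000.0 / t for t in frame_times_ms)
    idx = int(len(fps_samples) * percentile / 100)
    return fps_samples[idx]

# Example: mostly ~144 fps (6.9 ms) with occasional dips to ~80 fps (12.5 ms).
log = [6.9] * 90 + [12.5] * 10
print(f"Suggested cap: {suggest_cap(log):.0f} fps")  # ~80 fps: slower, but consistent
```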
 
I started gaming on PC much more in the past few years, and I find myself preferring a solid 60fps no matter the cost to IQ. The only exceptions are non-action games that have a 30fps cap option. If I can't cap, I still try to hit 60 to avoid the varying stutter.
Then there are games like HOB, which I played recently, that had dips no matter how low the quality was, so I said fuck it and cranked everything to 11 because it made no difference to the shit performance.
 
no "varies, depending on the game"?
Agreed, it should be an option.
For something like an adventure game it doesn't really matter, but for a simulation, higher = better.
I remember on the Descent 2 boards someone claimed you really needed over 200fps (I don't know if they were correct),
but they had all the calculations (ship movement per frame, laser movement per frame etc.) and it all seemed to make sense.
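For illustration, the kind of per-frame arithmetic that argument rests on is easy to reconstruct. A hypothetical sketch; the speeds are made-up placeholders, not Descent 2's actual values:

```python
# Hypothetical sketch of the per-frame calculation described above.
# LASER_SPEED and SHIP_SPEED are illustrative placeholders, not real values.

LASER_SPEED = 250.0  # game units per second (assumed)
SHIP_SPEED = 60.0    # game units per second (assumed)

for fps in (30, 60, 120, 200):
    frame_time = 1.0 / fps
    laser_step = LASER_SPEED * frame_time  # distance a laser covers between frames
    ship_step = SHIP_SPEED * frame_time    # distance your ship covers between frames
    print(f"{fps:>3} fps: laser moves {laser_step:5.2f} u/frame, "
          f"ship moves {ship_step:4.2f} u/frame")

# At 30 fps a fast projectile jumps ~8.3 units per frame; at 200 fps it moves
# ~1.25 units, so fast objects trace a much smoother, more readable path.
```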
 
I think a consistent 30 is the bare minimum; below that wouldn't be perceived as smooth animation. Even then, 30 isn't great, but it's bearable.

What I consider bearable now is different from what I considered bearable as a kid though. I played probably hundreds of hours at least of Star Fox, which spent most of its time at something like 15 fps. And it was pretty much fine; it felt a little jerky but was very playable. Nowadays that would be way too slow.

Same thing with some early 3D PC games. They ran badly on my PC at the time, but I had no way to improve it, so I just kind of put up with it. It was still enjoyable.

I guess it depends how spoiled you are. Once you're spoiled on high framerates, it's hard/impossible to go back.
 
I always try to get a solid 60fps (more in multiplayer games such as Battlefront etc.). If it's not possible to hold 60fps, I would rather play with a locked 50fps without Vsync than play at 30fps. Even 40fps feels much better than 30fps.
 
My case doesn't fit any of the options.

I have a Freesync monitor (like many others nowadays) and my minimum framerate is the minimum variable sync refresh rate supported by my monitor, which is 40Hz.

I get no tearing, minimal input latency, and no need for a constant 60FPS if I want something above 30FPS.
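A minimal sketch of how a VRR window like that behaves, assuming a 40-75Hz Freesync range (only the 40Hz floor comes from the post above; the 75Hz ceiling and the generic LFC frame-repeating scheme are assumptions):

```python
# Minimal sketch of how a VRR window maps framerates to refresh behaviour.
# The 40-75 Hz range is an assumed example; only the 40 Hz floor is from the
# post. LFC (Low Framerate Compensation) here is the generic frame-repeating
# idea, not any specific driver's implementation.

VRR_MIN, VRR_MAX = 40, 75  # Hz (upper bound assumed)

def refresh_for(fps):
    if fps > VRR_MAX:
        return f"{VRR_MAX} Hz (capped at the top of the window)"
    if fps >= VRR_MIN:
        return f"{fps} Hz (refresh tracks framerate exactly)"
    # Below the window: repeat each frame enough times to land back inside it.
    mult = -(-VRR_MIN // fps)  # ceiling division
    return f"{fps * mult} Hz (LFC: each frame shown {mult}x)"

for fps in (25, 40, 55, 90):
    print(f"{fps:>2} fps -> {refresh_for(fps)}")
```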
 
Prefer a motion resolution of 60 or more.
Wouldn't mind motion interpolation for games like Detroit or other slow-paced games, to get the best of both worlds.
 
60 fps on LCD screens is going to be the new 30fps soon, now that we have things like Lossless Scaling and other solutions. I barely play games at less than 165fps anymore, except for some AAAs with demanding graphics. In most games, 55fps (locked with RivaTuner) + Lossless Scaling 3X with Performance mode enabled does a really good job for me on my 165Hz monitor.

Then there is CRT framerate, which seems to be on a whole different level. According to DF staff, 60Hz on a CRT feels muuuuuuch smoother than 60fps on an LCD, and 120Hz CRT is much better than 120Hz LCD; 120Hz on a CRT feels like 240Hz on an LCD.
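The 55fps cap isn't arbitrary: 55 × 3 = 165, so the generated frames exactly fill the panel's refresh. A trivial sketch of that arithmetic (plain maths, not anything from Lossless Scaling itself):

```python
# Illustrative only: why a 55 fps RivaTuner cap pairs with 3x frame generation
# on a 165 Hz monitor. Plain arithmetic, not Lossless Scaling's API.

def base_cap(refresh_hz, multiplier):
    """Base framerate to lock so generated output exactly matches the refresh."""
    return refresh_hz / multiplier

for mult in (2, 3):
    print(f"{mult}x frame generation on a 165 Hz panel -> "
          f"lock the base framerate at {base_cap(165, mult):g} fps")
# 2x -> 82.5 fps, 3x -> 55 fps (hence the 55 fps RivaTuner lock above)
```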

 
Voted 30fps average.

I've played video games since the mid-1980s; I don't need 240fps to feel safe and secure. It also helps that I absolutely do not play twitch shooters.
 
Can't stand 30 fps on my OLED; it worked fine on my old plasma.
I played Spider-Man 2 at 40 fps, and that worked really well, even for a fast-paced game like that. It needs to be a very stable 40 fps though. For something like Doom I want 60.
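That 40fps sweet spot has a tidy explanation, assuming the usual 120Hz output mode for console 40fps modes: 40 divides 120 evenly, and its 25ms frame time sits exactly halfway between 30fps and 60fps. A quick check of the arithmetic:

```python
# Why a stable 40 fps mode works well on a 120 Hz display: 40 divides 120
# evenly (each frame held for exactly 3 refreshes), and its frame time sits
# halfway between 30 and 60 fps. Plain arithmetic, nothing game-specific.

REFRESH_HZ = 120  # assumed output mode for the 40 fps case

for fps in (30, 40, 60):
    frame_ms = 1000 / fps
    refreshes_per_frame = REFRESH_HZ / fps
    print(f"{fps} fps: {frame_ms:.1f} ms/frame, "
          f"{refreshes_per_frame:g} refreshes per frame at {REFRESH_HZ} Hz")

# 30 fps: 33.3 ms (4 refreshes); 40 fps: 25.0 ms (3); 60 fps: 16.7 ms (2).
# The midpoint of 16.7 and 33.3 ms is 25 ms -- exactly the 40 fps frame time.
```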
 
In my college days I finished quite a lot of games at sub-30fps on PC, but I guess that couldn't be helped with the weak hardware I had, and it has given me quite a bit of tolerance for slideshow-level fps. I usually didn't measure fps, just kept cranking settings until the game became unplayable.

Now that I think about it, playing on the PS2 was so pleasant because of the frame rate.

My current laptop has adaptive sync, and I guess that is hiding judder, frame-time issues, or other timing-related stuff, because even though everything is cranked up I don't see the same kind of frame issues.
 
I just want that CRT feeling back, when you didn't need to instinctively un-focus your eyes each time you took a turn in an FPS game.
100Hz on a CRT was from another world compared to my 240Hz OLED.
So, the higher, the better for me...

And yes, I prefer the OLED in every other aspect.
It's just the motion clarity that is lacking.
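A rough first-order model of that gap: on a sample-and-hold display, perceived smear is roughly on-screen speed times pixel persistence, while a CRT's phosphor flashes for only a millisecond or two. A back-of-envelope sketch; the ~1.5ms phosphor figure and the panning speed are assumptions:

```python
# Rough first-order model of sample-and-hold motion blur: perceived smear is
# approximately on-screen speed x pixel persistence. The ~1.5 ms CRT phosphor
# figure is an approximation; real persistence varies by phosphor.

SPEED_PX_PER_S = 1000  # assumed panning speed, for illustration

def blur_px(persistence_ms):
    return SPEED_PX_PER_S * persistence_ms / 1000

cases = {
    "60 Hz sample-and-hold LCD/OLED": 1000 / 60,   # full-frame persistence
    "240 Hz sample-and-hold OLED":    1000 / 240,
    "100 Hz CRT (impulse display)":   1.5,         # assumed phosphor decay
}
for name, ms in cases.items():
    print(f"{name}: ~{blur_px(ms):.1f} px of smear at {SPEED_PX_PER_S} px/s")

# ~16.7 px at 60 Hz, ~4.2 px at 240 Hz, but only ~1.5 px on the CRT --
# which is why even a high-refresh OLED can trail a 100 Hz CRT in clarity.
```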
 