Playable framerates...

I have a theory about it. I think one's opinion on playable framerates is determined by the hardware they "grew up" with or have used for a long time. For me ~30 fps is playable, but the hardware I use isn't top of the line either.

Also, I think that people read the reviews of top-of-the-line cards, see 100+ framerates, and that causes the "playable" line on their meter to rise. Of course, this may have just been me. I did it a few times before I realized what I was doing. :?

Any thoughts on this?

This is just a theory discussion. No research numbers and stuff please.
 
There is no set limit that people find "playable". It depends on the game. FPSs and racing games typically need faster framerates than flight sims, for example.
 
I've found that if I tone down the detail and get a faster fps, it's easier to do well in games.
 
For FPS or racing games, basically anything fast paced, I don't like anything below 60 fps. But for games that are slower in nature, I find 30 very playable.
 
I think it depends what games you're playing, and what framerate you've experienced. It's not fun playing Counter Strike at 30fps avg when you've tasted 60+.
 
First-person shooters need 60+fps to be "whole" IMO.. especially in competitive play.

To be frank, I don't mind ~30 fps in other genres.. although 60 fps is obviously preferred.
 
Yeah, it really depends on what game you're playing. Though it is true that most any game will benefit up to 60 fps, above that depends heavily on what sort of game it is. For example, in a game like Warcraft 3 I simply cannot see benefiting from anything over 60 fps, though any fast-paced FPS will continue to show improvement no matter the framerate.
 
Most games/benchmarks report average fps, not minimum fps, a histogram, or (average and std. dev).
When the average fps is 45 (which is enough), the framerate still occasionally drops to 20 (which is not enough).

Something like 35-40 fps would be enough, if it never dropped below that.
Anything over that is just overkill.
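To illustrate the point about averages hiding the dips, here's a minimal sketch using made-up frame times (all numbers are hypothetical, just picked so the average lands near 40 fps while one spike drops to 20 fps):

```python
# Hypothetical per-frame render times in milliseconds (made-up numbers):
# mostly ~22 ms (~45 fps), with one 50 ms spike (20 fps).
frame_times_ms = [22, 22, 23, 21, 22, 50, 22, 23, 21, 22]

fps_per_frame = [1000.0 / t for t in frame_times_ms]

# Time-weighted average: total frames divided by total time.
average_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
minimum_fps = min(fps_per_frame)

# Population standard deviation of the per-frame fps values.
mean_fps = sum(fps_per_frame) / len(fps_per_frame)
std_dev = (sum((f - mean_fps) ** 2 for f in fps_per_frame) / len(fps_per_frame)) ** 0.5

print(f"average: {average_fps:.0f} fps, minimum: {minimum_fps:.0f} fps, std dev: {std_dev:.1f}")
```

The average comes out around 40 fps, which looks "enough", while the minimum and the standard deviation reveal the 20 fps stutter the average hides.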

suggestion:

turn off all those fps counters.
then increase resolution/effects/AA/AF as much as you can until the game starts to feel too slow.

and then turn the FPS counters back on.

though this does not work very well with superchips like the R300, as those will run any game fast at any settings anyway ;)
 
When I first started playing, I didn't notice bad frame rates. I had a badass Diamond Viper 550 (TNT), and I lived in a world of ignorant bliss (I want to go back)... (sobbing)..
 
60 fps is a bare minimum for fast-paced FPSs like Quake 3, Team Fortress, and UT2k3. For more tactically paced FPSs like Counter-Strike, you can live with a "mostly 60 fps" that occasionally drops lower.

RPGs and RTSs don't need much framerate at all. Warcraft II ran at 15 fps, Starcraft ran at 25 fps, Diablo ran at either 24 or 30 fps, and none of them felt particularly laggy.

So it all depends on the style of game!
 
When you get accustomed to higher speeds, you tend to notice lower speeds. And this usually isn't a good thing, even though the lower speed eventually gets you to the same place.

When I had the original Voodoo, I thought GLQuake at 30fps was good. Now, if I still play GLQuake at 30fps, I'll complain. It goes with experience (of gaming).
 
It's all about the minimum fps.

If you're averaging around 30-40 fps, the minimum fps could dip way below 30 fps, causing it to get choppy in places.

But if your average fps is, say, 100, then the minimum fps may only dip to, say, 40 fps in the worst situation, and that is still smooth, not choppy.

So the higher your average fps, the higher your minimum fps will be.

Now, the other factor is change in fps. We humans notice changes in the fps, and that annoys us.

For example, we may be going smooth at 60 fps, but then we get into a large scene and it drops instantly to 40 fps or 30 fps or whatever. It is that instant change in the frame rate that we notice and call lag.

So what we need is a more constant fps at all times, one that doesn't change wildly.

My thoughts on the fps issue, anyways.
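The "instant change" point is easier to see in frame times than in fps, since fps and per-frame delay are reciprocals. A small sketch (numbers are illustrative, not measurements):

```python
def frame_time_ms(fps):
    """Time the player waits for each frame, in milliseconds."""
    return 1000.0 / fps

# A drop from 60 to 40 fps sounds like "only" 20 fps,
# but each frame suddenly takes over 8 ms longer to arrive:
drop_60_to_40 = frame_time_ms(40) - frame_time_ms(60)    # 25.0 - 16.7 ms

# The same 20 fps drop from 100 to 80 costs far less per frame:
drop_100_to_80 = frame_time_ms(80) - frame_time_ms(100)  # 12.5 - 10.0 ms

print(f"60->40 fps: +{drop_60_to_40:.1f} ms/frame")
print(f"100->80 fps: +{drop_100_to_80:.1f} ms/frame")
```

Which is one way of seeing why the same absolute fps drop feels much worse at the low end, and why a high average buys headroom against noticeable jumps.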
 
Brent said:
It's all about the minimum fps.
Not quite. "Average fps" is an abused term to me -- it is not what I would like it to mean. "Minimum" cannot be disputed, of course (even though "minimum" can change for you... see below). "Average", while not exactly disputable, can vary depending on how any one app/game calculates it.

If you're averaging around 30-40 fps, the minimum fps could dip way below 30 fps, causing it to get choppy in places.

But if your average fps is, say, 100, then the minimum fps may only dip to, say, 40 fps in the worst situation, and that is still smooth, not choppy.

So the higher your average fps, the higher your minimum fps will be.
"Minimum" is minimum (= lowest) in my books... surely there can't be "higher" minimums regardless of what the "average" is, since "average" depends on "minimum"!! I think we need to define how "average" is calculated according to your interpretation -- that is, how "average" appears to be arrived at in whatever apps/games you use. Never forget that "maximum" is just as influential in determining "average" as "minimum" is.

What you want to say is the exact opposite, with an addition -- the higher the minimum fps you have, the higher the average fps you will have... but this also depends on the maximum fps you have.

For example, we may be going smooth at 60 fps, but then we get into a large scene and it drops instantly to 40 fps or 30 fps or whatever. It is that instant change in the frame rate that we notice and call lag.
I usually call that "Wot a sucky level"... but you're entitled to call it whatever you want :)

So what we need is a more constant fps at all times, one that doesn't change wildly.
(Incorrect) Understatement Of The Year -- substitute "need" with "want" and I'm alright. But really, this comes down to game-engine/level design. If you can tell me of instances where an IHV tells a developer, "C'mon, create a game with no maps/levels where we have lower-than-normal performance", I'll give you my system... developers aren't stupid.
 
I used to live with 1 frame per 5 minutes with old Infocom adventures (I was a slow typist). (Just in case anybody points out the screen was updating faster, I'd point out you could play by teletype...)

Being spoiled by the Amiga, I find anything less than 50 fps (PAL) to be disgraceful. And I mean 50 fps constant, never ever dropping for any reason. Actually, I regret the lag introduced by double buffering these days as well (I used to use single buffering by chasing the raster).

As you can imagine, I never liked triple buffering (stupid idea: the image appears HOW long after I pressed the button!)

So if I'm playing a 2D shooter, I feel the need for at least 100 fps to match my nearly 20-year-old machine... That's progress for you.

Ignore me I'm feeling nostalgic....

That's the real reason people prefer playing CS at 100+ fps: it's the reduction in lag. IIRC it should be no more than about 50ms; your brain-to-finger interface takes about 20ms, which leaves you about 30ms to update the screen from when the button is pressed.
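The lag budget above can be run as back-of-the-envelope arithmetic. The 50ms and 20ms figures are the post's rough estimates, and the factor of two for double buffering (worst case, a button press waits up to two frame times before it appears on screen) is my assumption, not a measurement:

```python
# Rough press-to-photon lag budget (all numbers are the post's estimates).
total_budget_ms = 50     # acceptable total delay
reaction_ms = 20         # "brain to finger interface"
display_budget_ms = total_budget_ms - reaction_ms  # 30 ms left for the screen

# Assumption: with double buffering, worst-case display latency is
# roughly two frame times, so two frames must fit in the budget.
min_fps = 2 * 1000.0 / display_budget_ms

print(f"{display_budget_ms} ms budget -> roughly {min_fps:.0f} fps needed with double buffering")
```

That lands in the high-60s fps range under these assumptions; add driver and input-polling overhead and it is not hard to see why competitive CS players push for 100+.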
 
That is something people tend to forget: it is not just what the eyes can see and their limits, but the whole experience -- total immersion, suspension of disbelief, being transported into a virtual reality... damn fingers demand satisfaction too! ;)

I was totally immersed in the first Grand Prix game by Geoff Crammond, which at times ran at something like 5 fps and was doing no more than 10 fps on a good day on my Atari 520 STFM. If I played a game with such low fps now, I would throw it in the bin, no questions asked (or return it to the store I got it from, hehe).

For me, as long as everything is smooth and there are no stutters every now and again, I am happy, whether it is 25 fps, 60 fps or 600 fps.

The most playable game ever, IMHO, is a 2D platformer in any case, and I have no idea what framerate it ran at, cos I didn't care and didn't know what an fps counter was (ignorance is bliss).

If only people would review graphics cards by playing games on them first and benchmarking them afterwards, they would realise how much those numbers they are always quoting distort their judgement of a perfectly fine product.
 
As long as the minimum stays over 40 fps I cannot tell a difference at all. Down to 30 fps is noticeable, but nothing to complain about; once it drops below that, I start cutting back on eye candy. Oh, and I play CS mostly on my 9700 Pro, so it is locked at my refresh rate for a constant 85 fps, but I still feel just fine in games that do much less.
 