Playable framerates...

Tahir: There is an edit button.

When I first got Quake3: Arena and still had a Voodoo5 5500, I ran it at 1280x1024x16 with 2x FSAA. It played more or less just fine...

...then I turned on the FPS counter. Average 15fps, minimum around 11-12, maximum around 17-19. Actually it was still just fine though; it was very consistent, and hovered at 14-16 almost all the time. :)
 
For me, average doesn't matter at all; minimum does. If I'm in a big firefight and FPS dips below 40, I get pissed. That's in a first-person shooter; in an RTS I don't mind if my low is 20 FPS, because there it really doesn't matter that much.
 
Nebuchadnezzar said:
K.I.L.E.R said:
Anything under 30fps is good because it's really slow.

:oops: :?: :?: :!: :?:

Science has proven that anyone who wants to see more than 10 FPS is mentally ill. I'm sorry, but I don't want to become mentally ill. It's been proven by the Inquirer, and I don't want any more than 9 FPS.
 
I believe there was a technical article posted on Blue's News, oh, about 2 1/2 to 3 years ago, about visual perception, FPS, and the illusion of movement and such.

Basically it said that you need enough FPS to saturate your optic nerve with information, thereby giving the perception of motion. The author then went on to mention that film and TV can get away with a lower FPS than a computer monitor. That's because the image is "burnt" onto your retina for a couple of milliseconds before the next image is shown (look at a bright light; notice that when you look away you can still 'see' it).
Because of this persistence, you get a sort of blend between frames, giving the illusion of motion and of a high refresh rate. A computer monitor doesn't do this (as much), so it needs a higher refresh rate, and therefore at least a matching FPS.

I mean, who likes looking at a monitor with a 25 Hz refresh rate? So what would make you think that a game running at 25 FPS would be OK?

Keep in mind this is all paraphrased from memory after reading this article 3 years ago.
 
I was kidding above. :LOL:

Actually another theory is that computer monitors make you go retarded after a while and that is why video games are linked to (oh FACK here we go again :rolleyes:) violence.
 
SlmDnk said:
Nice proggie

http://personal.inet.fi/atk/kjh2348fs/fps_compare_v0.2.zip

Synthetic stuff, yes, but quite helpful anyways...

Wow... someone other than me (the author) actually knew about FPS Compare! :D
The "official" site is at http://sdw.arsware.org/ btw, but no changes have been made since v0.2.
I'm actually working on a new version (that looks a little better as well) and trying to find good scenes to really visualize the difference in smoothness between different FPS values.
 
I find 18 FPS still perfectly playable, and it was my target FPS in software rendering times. The most important thing is that it's constant. There are a few games that run at 100+ FPS when looking into a small room but can dip below 18 FPS when looking down a long corridor while firing a plasma gun. Some games even display only the reciprocal of the shortest frame rendering time during a second, while it still doesn't feel smooth, because smoothness is determined by the longest frame rendering time.
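That reporting quirk is easy to illustrate. A minimal Python sketch, using made-up frame times (not measured from any particular game): one long stall in a second of otherwise fast frames barely moves the average, and an optimistic counter showing the reciprocal of the shortest frame time hides it entirely.

```python
# Hypothetical frame times (seconds) for one second of gameplay:
# 110 fast frames, then one long stall while looking down a corridor.
frame_times = [0.008] * 110 + [0.12]

avg_fps = len(frame_times) / sum(frame_times)   # what an "average FPS" counter shows
peak_fps = 1.0 / min(frame_times)               # reciprocal of the shortest frame
worst_fps = 1.0 / max(frame_times)              # what the stall actually feels like

print(round(avg_fps), round(peak_fps), round(worst_fps))  # 111 125 8
```

The average says "111 FPS", but that one 120 ms frame is an 8 FPS hitch, and that is what you notice.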

8 FPS is another barrier. If it goes below that, it isn't responsive any more: click the fire button when the enemy is in the center of your crosshair and you've already missed him. With some effort, you can still consider 8+ as 'playable'. Sometimes I play at university, and since those computers don't have hardware acceleration I have to accept this kind of performance, and I still find it fun to play Counter-Strike.

Of course, I do prefer 30+ FPS, or even 60 FPS and more, but you asked for my definition of playable...
 
On a side note :

My brother and I played through Quake 1 in cooperative network mode, back when I had a Pentium 90 and he had a 486-66. I ran at 400x300 and got 16-17 FPS, which I thought was a-OK. My brother got about 4 FPS at 320x200 :D

Nowadays I really prefer games to run at monitor refresh rate (100 or 120Hz depending on res.)
 
And just to clarify some things about FPS and motion blur: everything that moves more than 1 pixel between two successive frames can be considered aliasing in the time domain. Without motion blur, fast-moving objects at 30 FPS seem choppy because they move more than 1 pixel per frame. At 60 FPS you don't see any choppiness for a given movement speed, but objects faster than a certain limit (the Nyquist limit) will still look choppy. So you have to go to 120 FPS or more, and so on (so if you want to play a fast-paced action game you need at least 100 FPS).
This way you are only raising the Nyquist limit; there will always be a movement speed that produces aliasing (choppiness).
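The arithmetic behind that claim is simple: per-frame displacement is speed divided by framerate, and by the definition above anything over 1 pixel per frame is temporal aliasing. A quick sketch (screen width and speed are invented numbers for illustration):

```python
def px_per_frame(speed_px_per_s, fps):
    """Displacement between successive frames; > 1 px reads as temporal aliasing."""
    return speed_px_per_s / fps

# An object crossing a 1280-pixel screen in one second:
for fps in (30, 60, 120):
    print(fps, round(px_per_frame(1280.0, fps), 1))
```

Even at 120 FPS the object jumps more than 10 pixels per frame, so raising the framerate only pushes the aliasing threshold up; it never removes it for all speeds.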

On the other hand, you can just stay at 30-40 FPS and apply motion blur. Fast-moving objects will simply start to look blurry. You can even stay at 20 FPS or lower, but then even slow-moving objects will look blurry. Consider that many NTSC DVDs run at only ~24 FPS and none of them goes beyond 30 FPS, so it's not as bad as many people think.
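A toy way to see what motion blur does: average several sub-frame renders across one frame's exposure window. This sketch uses plain uniform temporal supersampling (not the stochastic method mentioned below), on a 1D "screen" with an invented dot speed:

```python
def render(pos, width=16):
    """Point-sample a 1-pixel dot at position pos on a 1D 'screen' row."""
    row = [0.0] * width
    row[int(pos)] = 1.0
    return row

def motion_blurred_frame(pos_at, samples=8, width=16):
    """Average several sub-frame renders across one frame's exposure [0, 1)."""
    acc = [0.0] * width
    for i in range(samples):
        t = i / samples                  # uniform sub-frame times (not stochastic)
        for x, v in enumerate(render(pos_at(t), width)):
            acc[x] += v / samples
    return acc

# A dot moving 4 px per frame (e.g. 120 px/s at 30 fps): without blur it
# teleports 4 pixels each frame; with blur its energy is spread across the
# 4 pixels it crossed, reading as a streak instead of a jump.
pos_at = lambda t: 0.1 + 4.0 * t
frame = motion_blurred_frame(pos_at)
print(frame[:5])   # first four pixels each hold 1/4 of the dot's energy
```

A stochastic sampler would jitter each `t` randomly within the exposure instead of spacing the samples uniformly, trading banding for noise.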

To summarize, quality motion blur (a.k.a. antialiasing in the time domain) is much more important, and cheaper, than high frame rates for achieving realism. But the sad thing is that Pixar has patented the stochastic sampling method, so it's unlikely we'll see it in hardware before the patents start to expire.

http://www.beyond3d.com/forum/viewtopic.php?p=105341&highlight=motion+blur#105341
 
The "insane FPS" craze really took off around the release of Quake 2. It didn't take long after the "QuakeWorld physics" patches for players to discover that if you could saturate the server (with client network updates synchronized to the client framerate), you could pull off some amazing things.

With a server cap of about 90 updates per second, "pro" players discovered that if they could push their clients (and network connections) to >100 FPS and >100 updates per second, they could pull off amazingly long strafe jumps, get server mis-sync "kills", and do other acrobatics that players below 90 FPS couldn't. I got a first-hand demonstration of framerate-saturation tricks from several pro players, and I have to admit it was damn impressive.

Online gaming has progressed so that framerate no longer controls gameplay as much, and more and more determinations are made by the server rather than by a mis-synced client PC. Yet the drive for higher FPS goes on, as people got used to the ultra-fast responsiveness and crisp control of a game engine running at >80 FPS. There IS a distinct difference in feel and control, especially in twitch shooters, even when the framerate is well above what is humanly visible or above the monitor refresh. What you can't possibly "see", you can most definitely feel in control responsiveness and engine accuracy. So the quest goes on for this part of the gaming community.

As far as a "playable FPS" goes: there is no such thing. It comes down to too many factors, since game engines aren't all written the same way. Some games have control/input loops in sync with the framerate, while others have complex buffering/queuing schemes. Some games also require more precise visual feedback for the controls (racing games, for example), so any latency in visual feedback can be disastrous in those genres. So even two games of the same genre can be coded differently, and framerate will affect their control differently.

In general, you will never really be able to pin down a 'minimum playable framerate', both because of human taste and because you would be trying to get your hands around a much bigger topic, with enough variables to throw off any such determination.
 