30fps vs 60fps - is there a tech demo out there to persuade the ignorant?

I'm so, so, so very tired of seeing people on forums say "well of course the eye can only see at 24/30 frames per second", claiming it as an absolute truth.

It'd be nice if I could throw them a little tech demo which shows a 3D image, and where the user can select the framerate to play it at, to demonstrate how much smoother 60fps* is than 30.

(I actually remember seeing one, a spinning cube, but I'm damned if I can find it... I think it was made by 3dfx)

Has anyone got such a thing? Or knows of a very simple language I could learn to write it in?


(* or indeed higher, if they're on a spangly LCD or a CRT)

cheers,
 

That would not prove the point though. If you move something in 3D at the same angular rate (rad/s), you get twice as many discrete positions at 60Hz as at 30Hz. Even if the eye could only see 30Hz, nothing says its sampling lines up precisely with the timing of the display, or is even consistent, so the 60Hz version will still seem smoother because there is a smaller angular change between the frames the eye perceives.
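To put numbers on that point, here's a tiny Python sketch; the one-revolution-per-second rate is just an assumed example value:

```python
import math

def step_per_frame(angular_rate_rad_s: float, fps: float) -> float:
    """Angular change between two successive displayed frames."""
    return angular_rate_rad_s / fps

rate = 2 * math.pi  # one full revolution per second (assumed example rate)
print(f"30Hz: {step_per_frame(rate, 30):.3f} rad between frames")
print(f"60Hz: {step_per_frame(rate, 60):.3f} rad between frames")
# The jump between frames at 60Hz is exactly half that at 30Hz, so
# whatever the eye's own 'sampling' does, the steps it sees are smaller.
```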
 
Those people aren't even presenting the right argument. It's supposed to be that 30fps is completely satisfactory for movies, so it's enough for games.

In fact, when you see a 60fps video, it looks very amateurish, like it was shot with a consumer level camcorder without progressive frame capture. For a cinematic feel you need 30 fps or 24 fps. Our minds are trained that way.

As for me, the main reason I prefer 60 fps is that I'm still satisfied when the framerate dips.
 
I don't think there's much to discuss.

If you can't see that a television looks a bit choppy, you need your eyes (brain?) checked. Ergo 30 FPS is not enough, not even with 'motion blur'.

Since, in the limit of infinite framerate, motion blur appears naturally, and since it's very easy to tell whether a game is using motion blur no matter how high the framerate is, I hold it self-evident that you can tell the difference between 160 FPS (the highest my CRT will go at low resolutions) and many hundred.
 
Thanks for the replies guys...

Sxotty, I can only say that the 3dfx demo worked very well at showing 60fps being smoother than 30fps... so I'm guessing any problems with timing against the screen refresh are negligible.

Mintmaster - an interesting point about cinema. As I understand it, the way movies are projected in cinemas has a lot to do with our preference for watching film. At some point I'm going to investigate just what it is about projection that makes it so different from other framerate-limited situations. However, I'm trying to Keep It Simple, and stick with LCDs, monitors, and at a stretch TVs, as that one is easy to rebut....

Soylent - you mention TVs. Actually, a TV's image frames are shown twice, with an intervening black frame... the eye and the brain don't mind that, and the end result is 50Hz (PAL) or 60Hz (NTSC) and hence less visible flicker. And the intentionally high persistence of the phosphor on CRT TVs also tends to retain the image between frames, which further blurs them together. (Remind people how the bat and ball on a Pong video game always left big white streaks across the screen, to demonstrate just how long the phosphor keeps glowing.)
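To illustrate the phosphor persistence point, here's a rough Python sketch; the exponential-decay model and the 1ms/20ms persistence figures are my own simplifying assumptions, not measured values:

```python
import math

def phosphor_brightness(t_ms: float, decay_ms: float) -> float:
    """Fraction of a phosphor dot's brightness remaining t_ms after
    excitation, modelled (very crudely) as exponential decay."""
    return math.exp(-t_ms / decay_ms)

frame_time_ms = 1000 / 60  # one NTSC field, ~16.7ms
for decay_ms in (1.0, 20.0):  # fast monitor phosphor vs slow TV phosphor (assumed)
    leftover = phosphor_brightness(frame_time_ms, decay_ms)
    print(f"{decay_ms:>4}ms phosphor: {leftover:.1%} of the last frame still glowing")
# A slow phosphor still shows roughly 40% of the previous frame when the
# next one arrives - that's the blending (and the Pong streaks) in action.
```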

Which leads us to 100Hz televisions too... which many AV buffs claim make watching movies less 'cinematic'... because movement on screen becomes too smooth ;)



The alternative to finding a tech demo where I can control the framerates is of course to get the person to run an old Half-Life-engine game, like Counter-Strike, which will run on most systems at > 60fps, and then get them to use 'fps_max' in the console to limit the framerate to lower numbers. At which point they'll have to concede there's a difference and ergo, q.e.d, quo vadis, veni vidi vici, orbis non sufficit... the human eye can see more than 30 frames per sec ;)
But that's a long-winded way of doing it - a tiny tech demo viewable in a window would be easier.
 
In fact, when you see a 60fps video, it looks very amateurish, like it was shot with a consumer level camcorder without progressive frame capture. For a cinematic feel you need 30 fps or 24 fps. Our minds are trained that way.
IMAX and Omnimax frame rates are typically higher (up to 60Hz) and you could hardly argue they aren't "cinematic"
 
I decided to test your theory in call of duty (cos I remember the commands for it)

and there is a difference, but it's very small. In fact I reckon if you changed someone's max fps from 60 to 30 without telling them, they would never know
 
I decided to test your theory in call of duty (cos I remember the commands for it)

and there is a difference, but it's very small. In fact I reckon if you changed someone's max fps from 60 to 30 without telling them, they would never know

Certainly 30fps is playable.

I had a very protracted discussion with someone about framerates... there are a lot of factors at play. For instance, an old LCD screen with high latency (16-25ms) will go some way toward blurring successive frames together, and can actually make a low-framerate game look better than if it were run on a modern 2ms LCD monitor.

The point was made that many games are locked at a maximum of 30fps, including FPS games like Halo 2. Some games seem to look better at low framerates than others. Consider Crysis... people play that at 20fps... but maybe, with so much on-screen, the player becomes less sensitive to the stutters (or perhaps it's just that he knows he's going to get juddery views, and therefore plays at a slower pace, tending to make smoother movements with the mouse).

I'd contend you wouldn't want to play at much less than 30fps in most first-person games though, especially not in a fast-paced multi-player match, where fast panning is required for quick reaction shots.

However, the key point I'd like to be able to address first is that the eye CAN see above 30fps. (The rest is mostly personal preference and opinion.)
Gotta kick that one to death first before worrying about anything else :)
 
For games it's all about the visual input your brain needs. If you move an object quickly across the screen from left to right in 0.25 seconds, then 30fps gives you that object at 7 locations on your screen before it's gone. That may well be enough for your brain to interpolate the movement, but in this case it's very likely that 60fps, and 15 locations on screen before it's gone, is going to look a lot smoother and more pleasant, especially on a widescreen large enough to allow peripheral vision (more sensitive to movement) to come into play.
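Those location counts are easy to check with a couple of lines of Python:

```python
def positions_on_screen(crossing_time_s: float, fps: int) -> int:
    """Number of whole frames (and hence distinct screen positions) an
    object occupies while crossing the screen."""
    return int(crossing_time_s * fps)

# An object crossing the screen in 0.25 seconds:
print(positions_on_screen(0.25, 30))  # 7 snapshots before it's gone
print(positions_on_screen(0.25, 60))  # 15 snapshots: twice as many to interpolate
```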

A lot of games, though, do not have this kind of movement. But in first-person shooters, for instance, projectiles and fragments in particular can make 60fps a lot more pleasant to watch. Racing games are similar, especially at high speeds, but even more so for quick turning (which creates a lot of left-to-right movement). So 60fps is more suitable for one game than for another, and you have to weigh the odds. The illusion of smooth movement can, for instance, be just as effective with blur at 30fps as without blur at 60fps, etc.
 
Where racing games are concerned, 60 fps will always be better than 30 fps. There's just no way around that. Our eyes are much less forgiving of framerate in racing games than in other genres, except for fighting games.

There was an internal study done years ago at SEGA about games and frame rates, back in the Dreamcast days if I recall correctly, maybe in October of '99. The study took a game and switched it from 30 frames per second to 60 frames per second, then back to 30, then 60, and so forth. It showed that in racing games, when going from 30 fps to 60 fps, the player would generally have a faster lap; it would reverse going back from 60 to 30. Some tests started out at 60 fps and then went to 30 fps, to make sure it just wasn't the player getting to know the track very well. Regardless, the lap times on average were lower at 60 fps than at 30 fps. It was a similar story with fighting games. These games show that when there is very fast movement, the brain functions better the more motion information it receives.

Hope this adds something for ya.
 
Thanks for the replies guys :)

I've been googling around for 3D programming tools, looks like there's some stuff knocking around that might fit the bill for me to write my own demo.

Sonic - thanks, I'll take a look for those Sega studies, they'd definitely be interesting as the crux of my own personal argument is that although I can play games at low framerates, I feel I play much better at high ones. I play Day Of Defeat: Source, which is quite fast-paced (certainly the way I play it ;)) and especially when using rifles I *need* smooth framerates. A proper study from a respected developer would be excellent back-up to that argument.

Arwin - thanks for mentioning widescreen, and the number of images that will be displayed when panning... I noticed that the same framerate on my desktop PC feels worse than on my laptop - I guess that's a result of the bigger horizontal resolution (and faster LCD panel response) of the bigger screen.

Zaphod - LOL, that'd do it :D I hate the fact some of my work colleagues still have CRTs and leave them on 60Hz... whenever I see them in the corner of my eye it's ... just... horrible :( I've gone and sneakily changed a few to 75Hz ;) I guess it shows some people are just not sensitive to framerates / Hz.
 
Unfortunately those studies were never made available to the public. It was purely for internal reasons to showcase why there is definitely a preference for 60 fps.

I think movies and whatnot can get away with lower framerates because the amount of information in each frame is not necessarily just a still image but contains motion information too. There was an article, either on Ars Technica or here, years ago that helps explain this. I'll see if I can find it.

Here it is.

http://www.beyond3d.com/content/articles/66/
 
A long time ago my friend and I were playing around with MAME and Space Harrier, we were toggling between 30Hz, 60Hz and 120Hz. At the end of the day 120Hz is just so much better. Though it was also twice as fast as 60Hz.

I also did a blind test with a bunch of my friends using a really simple 3D Space Harrier-like demo I put together; they all picked the 120Hz version as the preferred demo. There weren't enough samples to form a conclusion, but it was interesting. Too bad I don't have the demo anymore. It was on one of the SGI machines at uni a long time ago.

I think you can try using Quake 3 or other older games with a CRT monitor capable of hitting high refresh rates. Just play them at different rates.

Back then, I thought that by today 120Hz would be standard for games. Now consoles are stuck with sub-30Hz games, some so choppy (eg GTA4) that I don't know how people can play them. My theory with framerate is that if you've never seen better (if you've always played choppy sub-30Hz games), it's not going to matter much. I think it's a luxury, not a necessity, for most people.
 
People need to take a signal processing class. I don't know why this stupid fallacy comes up so often.
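For anyone who did take that class: the classic demonstration of why sampling rate matters is temporal aliasing, the wagon-wheel effect. A small Python sketch, where the 28 rev/s wheel speed is just an assumed example:

```python
def apparent_rate(true_rev_per_s: float, fps: float) -> float:
    """Apparent rotation rate of a wheel sampled at fps frames per second.
    Sampling folds (aliases) the true rate into the range [-fps/2, fps/2)."""
    return (true_rev_per_s + fps / 2) % fps - fps / 2

wheel = 28.0  # revolutions per second (assumed example)
print(apparent_rate(wheel, 30))  # at 30fps the wheel seems to spin slowly backwards
print(apparent_rate(wheel, 60))  # at 60fps it is reproduced faithfully
```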

I played Tribes 1 a lot back in the day (a game with a lot of fast moving objects across the screen) and needless to say my individual performance dramatically improved when I went from a steady 60 fps to something more like 130. Like night and day difference. It went from instinctual guesswork to pure consistent play.

It also was ten times easier on my eyes and my head. An hour of 60 fps gameplay in that game feels like you just went to a rave and is highly fatiguing.
 
Fred, I can imagine that being the case.

Tell you what, I sure can't play WoW at anything below 60 fps.
 