Yes, but how many FPS do we actually see? The Nth iteration of an ever-fresh debate

True, but I didn't include this test to imply that the eye can distinguish a frame rate of 220 fps; it is just more evidence pointing toward the eye's ability to see high frame rates, be it 60, 100, 120, etc.

It really doesn't.

First you need to understand how eyes see images. They collect light, and the "signal" they send to the brain is continuous and always takes into account the past x ms of time. Compare this with how video cameras work, where the signal is discrete and takes into account one specific timeslice of collected light (so no event can show up in two signals).

This way, a bright signal that lasts for a very short duration will be seen for that whole x milliseconds, and is in effect the same as a much dimmer signal that lasts longer. This is how CRT screens work -- if you have ever taken a photo of one with a short exposure, you know that only a very small portion of the screen is lit (and only a single pixel is lit brightly) at any moment. But since the beam travels the entire surface of the screen in less time than that x, the whole screen appears lit.

So, if the signal is bright and distinct enough, it will be seen, completely regardless of how short it is. This has nothing to do with how quickly your eyes can register changes, or how many fps you need.
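To put a rough number on that, here is a toy sketch (my own made-up figures, not anything from a vision textbook): it assumes a simple box-filter integration window of about 20 ms for that "x" and just shows that a brief bright flash and a longer dim signal can deposit the same total energy inside the window.

# Toy illustration only: treat the eye as summing incoming light over a
# fixed window of roughly 20 ms (a stand-in for the "x" above; the window
# length and intensities are invented, this is not a real vision model).
INTEGRATION_WINDOW_MS = 20  # hypothetical "x"

def integrated_energy(intensity, duration_ms, window_ms=INTEGRATION_WINDOW_MS):
    """Total light collected within one integration window (simple box filter)."""
    return intensity * min(duration_ms, window_ms)

# A 2 ms flash at intensity 100 and a 20 ms signal at intensity 10 both
# deposit 200 units in the window, so in this crude model they look about
# equally bright -- which is the point about the CRT's briefly-lit phosphor.
print(integrated_energy(100, 2))   # 200
print(integrated_energy(10, 20))   # 200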

As for that, your brain really cannot process much more than a few tens of fps worth. But it is good at noticing irregularities and pays attention to them. So how many fps you need depends very much on how well you can fake that the screen is changing like the real world. That's why movies at 24 fps with good motion blur can look better than a desktop at 60 fps with no blur at all.

So how many fps do first-person games need? Probably more than 120 if you want to do it naively. But if you spend time on psychovisual optimizations, you might well beat that 120 fps in smoothness with 30.
 
Indeed, 24 fps is fine for theater movies.
There's a huge difference between video shot at 24 fps and video shot at 60 fps. This issue has been discussed to death in movie hobbyist circles.

24 fps film quality is acceptable, not fine. The frame rate wasn't originally chosen because it provided perfect quality; it provided acceptable quality for the cost (film was very expensive). One of my friends directs movies, and he always says 24 fps feels more "movie like" (matching the frame rate of the original film cameras). For him, video shot at 60 fps feels too smooth and realistic, too much like all the real-life videos shot by hobbyists with cheap 60 fps camcorders. Directors do not want films to look like the real world; it's an artistic choice, not a technical one.
 
James Cameron actually thinks it's time for the industry to move beyond 24 FPS, especially after the "3D" revolution that followed the success of Avatar. But you're right -- many directors would prefer to stick with classic cinematography in this regard, for purely artistic reasons. Even setting aside the 3D craze, I think a higher frame rate could be a good bonus on the big theater screen, and less so on the small TV screen at home. Screen size and field-of-view coverage do matter for perception here, methinks.
 
It all depends on what you're viewing and whether or not proper motion blur is applied. But I will say that 30 fps is just fine for almost all types of games if there is perfect full-scene motion blur.

Wrong.

For games (or virtual reality simulations in general), motion blur does little good.
The core issue is that there is no way to determine just what should be blurred, because you do not know what the player is paying attention to.

Example: imagine a scene where I'm looking across the street. I'm not rotating the viewport and there is no global angular velocity going on, so it would seem straightforward to just apply motion blur to the cars passing by. However, a good-looking girl comes bicycling in from the left and grabs my attention, so I follow her progress across my view. The game cannot know that it isn't supposed to apply motion blur to her, but rather to her background now. Nor can it know that I'm actually into boys and am instead following the guy walking in from the right....
Games are interactive - the game doesn't know where I'm looking (DOF) or what motion I'm following. Motion blur and DOF are photographic artifacts arbitrarily carried over from cinematography into games, where they do not make sense and interfere with player immersion.
 
There's a huge difference between video shot at 24 fps and video shot at 60 fps. This issue has been discussed to death in movie hobbyist circles.

24 fps film quality is acceptable, not fine. The frame rate wasn't originally chosen because it provided perfect quality; it provided acceptable quality for the cost (film was very expensive). One of my friends directs movies, and he always says 24 fps feels more "movie like" (matching the frame rate of the original film cameras). For him, video shot at 60 fps feels too smooth and realistic, too much like all the real-life videos shot by hobbyists with cheap 60 fps camcorders. Directors do not want films to look like the real world; it's an artistic choice, not a technical one.
You could also call it being averse to change. Or, as you alluded to, since higher frame rates are indeed associated with modern consumer equipment, simple snobbery.

Regardless, just like people in movies are no longer greyscale or walking really, really quickly, 24 fps will eventually also be a thing of the past. Good riddance. It will take a long time, though; I may not see it completely gone in my lifetime.
 
This thread is in serious need of moderation. I'm sure there is enough here to break all the fps talk out into its own thread.
 
Stop trolling. Moderation here exists for a major purpose: to reduce the noise and keep threads on track. Why? Because I, like many others, come here to read about many aspects of PC hardware/software, etc., and having to sift through tons of posts on something that has nothing to do with the thread in question is frustrating.
 
Back when I worked at a helpdesk, I spent some time on this issue too, as I found it interesting that I could always walk around the office and instantly spot whether someone's CRT was running at 60 Hz or not. I would always change the secretaries' settings to 85 Hz, and almost all of them noticed they got less tired, and some lost their end-of-day headaches. When I wanted to explain to them what was going on, I did the finger test as well.

It came to a point where labor safety laws imposed a 72 Hz minimum, though at the time not many offices had people who even knew what that meant. I could see it straight on, but if you want to be precise, just look at the screen with your peripheral vision and it's clear as day - your peripheral vision is far more sensitive to movement, while your central vision is more interested in detail and actively interpolates information that isn't necessarily even there. But take corners in racing games, which swing the horizon very dramatically, and 30 fps vs 60 fps stands out plenty.

Now, however, with LCDs the headache part has changed, because the frames per second don't actually matter in that respect - there is no constant relighting of the pixel, just a smooth transition.

A large cinematic picture moving at 24 fps will stand out a lot because it covers enough of your vision to be noticed by your peripheral vision, so large panoramic moving scenes with a lot of detail stand out even more. On the other hand, scenes where your focus is drawn to a specific moving subject stand out much less.

So while I agree 30 fps can still look pretty smooth with object motion blur (I've seen enough examples now to be convinced), and sometimes even without, 60 fps with object motion blur is still noticeably smoother (thanks to Tekken 6 for pointing that out).

Personally I think 60 fps per eye would be great, but 60 fps for both is still very smooth and a huge improvement over what we often have now. In that regard, though, LCDs are no longer comparable to the CRTs of back in the day, so it really is all about the movement now and no longer about the flickering (though active-shutter 3D's intermediate frame dimming complicates matters again).

You are wrong.

It's easy to see the difference between a 120 Hz and a 60 Hz refresh rate. Just move the mouse cursor on the Windows desktop (in both 120 Hz mode and 60 Hz mode), and it's obviously visible. We have several 120 Hz monitors at our office, and everyone sees the difference.

Also, I am personally very sensitive to light flickering. Bad CRT monitors at 60 Hz made my head explode after watching them only briefly, and the headache lasted for many hours. I can clearly see a CRT flickering at 60 Hz, and I can even see it at 85 Hz (the default refresh rate for most CRTs). So I mostly ran my CRTs at 100 Hz, and I could still see the flickering at night (dark room, no lights). Our eyes accumulate light (we do not see discrete images), so 100 Hz flickering should be completely invisible if we could only see 25 (accumulated) frames per second.
 
First you say I'm "very confused" and then you proceed with awfully vague statements, ending with a "lol".
Kudos for that contribution.

Maybe you should read a couple of chapters in some books on human optics, or you could just consult a few chapters on the University of Utah's Webvision site.


BTW, what's wrong with this forum that whenever someone tries to explain something from outside the hardware/software domain, there's always someone countering the explanation with a
"You are very confused and I'll just leave it at that. (...) lol"?

What's the purpose of this?
Saying "you're wrong, I'm right, I won't bother telling you why, lol" makes people feel better with themselves?


Whatever, let's just stay on topic then.

I am sorry for my childish response lol.

:devilish:

All jokes aside, it is true that your eyes don't operate in Frames Per Second. And I'll just leave it at that.

:devilish: :devilish:

I will accept the banhammer with honor.
 
Back when I worked at a helpdesk, I spent some time on this issue too, as I found it interesting that I could always walk around the office and instantly spot whether someone's CRT was running at 60 Hz or not. I would always change the secretaries' settings to 85 Hz, and almost all of them noticed they got less tired, and some lost their end-of-day headaches. When I wanted to explain to them what was going on, I did the finger test as well.

It came to a point where labor safety laws imposed a 72 Hz minimum, though at the time not many offices had people who even knew what that meant. I could see it straight on, but if you want to be precise, just look at the screen with your peripheral vision and it's clear as day - your peripheral vision is far more sensitive to movement, while your central vision is more interested in detail and actively interpolates information that isn't necessarily even there. But take corners in racing games, which swing the horizon very dramatically, and 30 fps vs 60 fps stands out plenty.

I took the MCAT on a CRT that was obviously refreshing at 60 Hz. Four hours of staring at that crap was absolutely brutal. I filed a formal complaint after the test was over, but I don't think anything was ever done about it. And this was in 2008, not exactly ancient history.
 
Man, I spent the 286-to-486 era (8 years?) on a 60 Hz CRT because monitors cost a lot back then. How about the interlaced resolutions below 60 Hz and the blurry high dot pitches? Ack. The good old days are never remembered properly. ;)

I've always found it interesting how people may not find 60 Hz annoying until you show them, say, 75 Hz. And 75 Hz might become annoying once you show them 85 Hz. Brain is learrrning.....

Wrong.

For games (or virtual reality simulations in general), motion blur does little good.
The core issue is that there is no way to determine just what should be blurred, because you do not know what the player is paying attention to.
Ya. Game motion blur essentially simulates a guy walking around with a camera. I usually turn that stuff off unless it's a driving game of some sort. Peripheral vision is enough blur for me.
 
This is why 30 and 60 fps have pretty much been standardized for console and PC gaming.
I'd love to see the official explanation of that standardisation. As I understand it, 60 Hz comes from CRTs syncing to the AC mains frequency, which is why PAL runs at 50 Hz (Wikipedia supports me on this). And once a display standard had been forced by the electrical supply, that constrained content creation, which set a legacy we have been tied to ever since. Now that display is decoupled from the mains, we can have any refresh rate we want up to the limits of the display technology, but content still follows legacy pipelines and traditions. Hence we still get 50 Hz DVDs in PAL despite 60 Hz TVs being ubiquitous.

No-one's ever picked a display framerate based on a Nyquist criterion or anything like that, AFAIK.
 
Wrong.

For games (or virtual reality simulations in general), motion blur does little good.
The core issue is that there is no way to determine just what should be blurred, because you do not know what the player is paying attention to.
In many cases the blurred version might still look more natural, but the blur shouldn't be too long and should give just a hint of the direction of the motion vectors.

But the problem should be 'very easy' to fix: just put a ~120+ fps camera that captures pupil movement on top of the monitor, and perform the motion blur and DoF at the monitor's refresh rate, independently of the normal rendering framerate.. ;)
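Something like this rough sketch is what I mean; read_gaze, render_scene, apply_blur_dof and present are just hypothetical placeholders (not any real API), and the rates are only the ones I'm guessing at above:

import time

DISPLAY_HZ = 120   # monitor / eye-tracker rate guessed above
RENDER_HZ = 30     # "normal" rendering framerate, decoupled from the display

def gaze_driven_loop(read_gaze, render_scene, apply_blur_dof, present):
    """Sketch: redo blur/DoF at display rate with fresh gaze data, while the
    expensive scene render only happens at RENDER_HZ."""
    frame, last_render = None, 0.0
    while True:
        start = time.monotonic()
        if frame is None or start - last_render >= 1.0 / RENDER_HZ:
            frame = render_scene()            # expensive pass, ~30 times/s
            last_render = start
        gaze = read_gaze()                    # latest pupil position from the camera
        present(apply_blur_dof(frame, gaze))  # cheap post-process, every refresh
        time.sleep(max(0.0, 1.0 / DISPLAY_HZ - (time.monotonic() - start)))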
 
But the problem should be 'very easy' to fix: just put a ~120+ fps camera that captures pupil movement on top of the monitor, and perform the motion blur and DoF at the monitor's refresh rate, independently of the normal rendering framerate.. ;)
Good in theory, but in reality the eyes shift their gaze WAY faster than the lag between capture and the actual change in the rendered image can keep up with.
 
I'd love to see the official explanation of that standardisation. As I understand it, 60 Hz comes from CRTs syncing to the AC mains frequency, which is why PAL runs at 50 Hz (Wikipedia supports me on this). And once a display standard had been forced by the electrical supply, that constrained content creation, which set a legacy we have been tied to ever since. Now that display is decoupled from the mains, we can have any refresh rate we want up to the limits of the display technology, but content still follows legacy pipelines and traditions. Hence we still get 50 Hz DVDs in PAL despite 60 Hz TVs being ubiquitous.

No-one's ever picked a display framerate based on a Nyquist criterion or anything like that, AFAIK.

Douglas Trumbull, when developing Showscan, found that 72 fps was the peak beyond which there was no perceptible difference to the viewer, although for technical reasons they again settled on 60 fps.
 