Yes, but how many FPS do we actually see? The Nth iteration of an always fresh debate

sebbbi

Veteran
yes, they have nicely decked out machines (hence my 570's comment) but that doesn't mean they'll turn on more bells & whistles, just more things that keep them over 250fps.
250 fps does no good. Minimum frame rate of 120 fps is what pros are aiming at. There's no use rendering more, since 120 Hz gaming monitors cannot show any more.
 
250 fps does no good. Minimum frame rate of 120 fps is what pros are aiming at. There's no use rendering more, since 120 Hz gaming monitors cannot show any more.

but I thought the human eye could only see 25 fps? :p
 
but I thought the human eye could only see 25 fps? :p
You are wrong.

It's easy to see the difference between a 120 Hz and a 60 Hz refresh rate. Just move the mouse cursor around the Windows desktop (in both 120 Hz mode and 60 Hz mode), and it's obviously visible. We have several 120 Hz monitors at our office, and everyone sees the difference.

Also, I am personally very sensitive to light flickering. Bad CRT monitors at 60 Hz made my head explode after just briefly watching the monitor, and the headache lasted for many hours. I can clearly see a CRT flickering at 60 Hz, and I can even see it at 85 Hz (the default refresh rate for most CRTs). So I mostly ran my CRTs at 100 Hz, and I could still see the flickering at night (dark room, no lights). Our eyes accumulate light (we do not see discrete images), so 100 Hz flickering should be completely invisible if we could only see 25 (accumulated) frames per second.

Some (old?) sources claim 25 fps is enough if the image frames are motion blurred (incoming light is accumulated during the frame period). 25 fps is pretty good for motion-blurred video, but you will see a striking difference to 60 fps in scenes where there's a lot of sideways movement. This is one of the (many) reasons why stereoscopic (3D) videos look pretty bad currently. Eyes seem to notice the low frame rate more on stereoscopic content. And I am not talking about active shutter-glass flickering here; I am talking about the actual frame rate of the stereoscopic image stream (24/25 fps per eye doesn't seem to be enough, we need to shoot at 60 fps x 2 at least).
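
As a quick numerical check on the accumulation argument above (a toy sketch in Python; the 10 µs step, the one-second duration and the whole "ideal integrator" model are illustrative assumptions, not anything from the post): if the eye simply integrated light over 1/25 s windows, a 100 Hz on/off flicker would average out to an essentially constant level, so the fact that the flicker is still visible is exactly the point being made.

# Toy sketch: integrate a 100 Hz square-wave flicker over successive 1/25 s
# windows, the way a hypothetical "25 accumulated fps" eye would.
import numpy as np

dt = 1e-5                                   # simulation step: 10 microseconds
t = np.arange(0.0, 1.0, dt)                 # one second of signal
flicker = (np.sin(2 * np.pi * 100 * t) > 0).astype(float)   # 100 Hz on/off flicker

window = round((1 / 25) / dt)               # samples per 1/25 s accumulation window
frames = flicker[: (len(flicker) // window) * window].reshape(-1, window)
accumulated = frames.mean(axis=1)           # average brightness per "eye frame"

# Every accumulated value is ~0.5: the flicker vanishes completely for a pure
# 25 fps integrator, yet people still see 100 Hz CRT flicker in a dark room.
print(accumulated.min(), accumulated.max())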
 
250 fps does no good. Minimum frame rate of 120 fps is what pros are aiming at. There's no use rendering more, since 120 Hz gaming monitors cannot show any more.

The point is not 250fps - that's just a side effect of the clean and simple picture, which makes it easier to spot the enemies
(unless we are talking about certain older engines where you can run faster / jump better at some framerates).
 
You are wrong.

See the :p icon? I was luring someone out of a corner.

From the matches I've attended (demonstrations and Intel Extreme Masters), FPS is locked at 250 for MW. All the things like screens etc. are mostly irrelevant, since the players will already complain about 59.1 or 60 Hz screens.

All this crap about what people assume to be professional gaming is taking this thread way off topic; you're missing the gems of hints here.
 
You are wrong.

It's easy to see the difference between a 120 Hz and a 60 Hz refresh rate. Just move the mouse cursor around the Windows desktop (in both 120 Hz mode and 60 Hz mode), and it's obviously visible. We have several 120 Hz monitors at our office, and everyone sees the difference.

Also, I am personally very sensitive to light flickering. Bad CRT monitors at 60 Hz made my head explode after just briefly watching the monitor, and the headache lasted for many hours. I can clearly see a CRT flickering at 60 Hz, and I can even see it at 85 Hz (the default refresh rate for most CRTs). So I mostly ran my CRTs at 100 Hz, and I could still see the flickering at night (dark room, no lights). Our eyes accumulate light (we do not see discrete images), so 100 Hz flickering should be completely invisible if we could only see 25 (accumulated) frames per second.

Some (old?) sources claim 25 fps is enough if the image frames are motion blurred (incoming light is accumulated during the frame period). 25 fps is pretty good for motion-blurred video, but you will see a striking difference to 60 fps in scenes where there's a lot of sideways movement. This is one of the (many) reasons why stereoscopic (3D) videos look pretty bad currently. Eyes seem to notice the low frame rate more on stereoscopic content. And I am not talking about active shutter-glass flickering here; I am talking about the actual frame rate of the stereoscopic image stream (24/25 fps per eye doesn't seem to be enough, we need to shoot at 60 fps x 2 at least).

Hmm... well, the first mistake is to claim that someone "sees" XY Hz.

Human vision doesn't work like a camera; there isn't anything remotely similar to a video system that captures a fixed number of frames per second across the whole capture area.

First, there's the fact that everything depends on luminance and contrast, even "refresh rate perception". Whether it's a high-contrast, high-luminance monitor, whether there's lots of daylight coming in or all the windows are shut, etc., makes some difference. And it's not linear.

Then there's the fact that the perception of both precision (the equivalent of a camera's resolution) and speed (the equivalent of refresh rate) depends a lot on the position within the field of vision.

Central vision is indeed very slow, as evolution has adapted it for precision rather than speed. The ~25fps figure is actually roughly correct for central vision, even if it changes a lot depending on the person.
As you go from the center towards peripheral vision, precision is traded away in favor of speed. Evolution, once again, made our peripheral vision a sensor system against predators and hazards, so it's a lot faster than central vision. AFAIK, it can go up to 120Hz.




So what does all of this have to do with office monitors and people suddenly starting to notice the difference between 60 and 120Hz monitors?
Size.

Whereas 10 years ago we'd have 15-19" monitors, the panels are a lot bigger now.
And larger panels mean a larger proportion of our vision is occupied by the monitor.
It means that not only is our central vision capturing the monitor's content, but so is some of our fast-as-hell peripheral vision.
Having our peripheral vision looking at the monitor is what makes people notice the difference between medium and high frame/refresh rates, and it's what causes the headaches and other problems.



Sorry for going slightly off topic, but I think it's important not to generalize "vision" as if we had a pair of digital cameras attached to our brain.

Furthermore, given this fact, I think it could be a very interesting subject of research to study a variable level-of-detail (anti-aliasing, texture resolution, etc) across the screen based on how close the pixels are to the center.
It could make a sizeable performance difference in games where people are always looking at the center of the screen (FPS, for example), and of course it would only work for large screens in desktops (where the FOV is mostly occupied by the screen).
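
Purely as a sketch of that idea (nothing here is a real implementation: the resolution, the 64-pixel tile size, the distance thresholds and the three quality buckets are all invented for illustration), a centre-weighted quality pick per screen tile could look roughly like this:

# Hypothetical centre-weighted level-of-detail selection, per screen tile.
import math

SCREEN_W, SCREEN_H = 2560, 1440
TILE = 64                                    # quality chosen per 64x64 pixel tile

def quality_for_tile(tile_x: int, tile_y: int) -> int:
    """Return an LOD index: 0 = full quality at the centre, higher = cheaper."""
    cx, cy = SCREEN_W / 2, SCREEN_H / 2
    px = tile_x * TILE + TILE / 2            # tile centre in pixels
    py = tile_y * TILE + TILE / 2
    # Normalised distance from the screen centre: 0 at the centre, ~1 at a corner.
    d = math.hypot(px - cx, py - cy) / math.hypot(cx, cy)
    if d < 0.25:
        return 0                             # e.g. full-res shading + 4x MSAA
    elif d < 0.6:
        return 1                             # e.g. full-res shading, no MSAA
    else:
        return 2                             # e.g. half-res shading / lower AF

# Example: count how many tiles land in each quality bucket.
counts = [0, 0, 0]
for ty in range(SCREEN_H // TILE):
    for tx in range(SCREEN_W // TILE):
        counts[quality_for_tile(tx, ty)] += 1
print(counts)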
 
All this crap about what people assume to be professional gaming is taking this thread way off topic; you're missing the gems of hints here.

With all this, and all your posts about BF3, they don't even have to be hints - people wouldn't be able to find them anyway :D

Furthermore, given this fact, I think it could be a very interesting subject of research to study a variable level-of-detail (anti-aliasing, texture resolution, etc) across the screen based on how close the pixels are to the center.

Indeed, especially Eyefinity IMHO isn't really viable until I can have less detail and resolution on the peripheral monitors. (Thumbs up for the rest of the post, btw.)
 
Indeed, especially Eyefinity IMHO isn't really viable until I can have less detail and resolution on the peripheral monitors. (Thumbs up for the rest of the post, btw.)

I do sometimes find myself looking at the peripheral monitors directly though (e.g. wing mirrors in F1 2010). It would be jarring to have a beautifully rendered image on the main screen and then a much lower quality one to the side.

I'm also not convinced that your vision tops out at 25Hz in the central 'high detail' portion, as you can still notice a difference in detail between 25Hz and 120Hz images (panning text, for example): what is unreadable in one becomes readable in the other, and as far as I know, you only read using your central 'sharp' vision.
 
I'm also not convinced that your vision tops out at 25Hz in the central 'high detail' portion, as you can still notice a difference in detail between 25Hz and 120Hz images (panning text, for example)
Yes. You can easily notice the difference between 60 fps and 120 fps at the center of your vision if you are watching discrete computer-generated image sequences (no motion blur). The difference is very clear. Simple test case: move your mouse cursor in a small circle at the center of your desktop. However, once we factor in motion blur (accumulated light instead of discrete frames), it should be pretty hard to distinguish 60 fps from 120 fps (25 fps -> 120 fps, however, is a completely different matter).
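
To put rough numbers on that mouse-circle test (the 100-pixel radius and two revolutions per second below are my own guesses, not sebbbi's), the gap between successive cursor images shrinks with frame rate like this:

# Gap between consecutive cursor positions for an assumed circular mouse motion.
import math

radius_px = 100            # assumed circle radius on screen, in pixels
revs_per_second = 2.0      # assumed hand speed: two circles per second

speed_px_s = 2 * math.pi * radius_px * revs_per_second    # cursor speed along the circle

for fps in (25, 60, 120):
    step = speed_px_s / fps
    print(f"{fps:3d} fps -> ~{step:5.1f} px between consecutive cursor images")
# 25 fps -> ~50 px jumps, 60 fps -> ~21 px, 120 fps -> ~10 px: the discrete steps
# shrink, which is why the difference is easy to see without motion blur.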
 
I do sometimes find myself looking at the peripheral monitors directly though (e.g. wing mirrors in F1 2010). It would be jarring to have a beautifully rendered image on the main screen and then a much lower quality one to the side.

Yes, obviously. Which is why I said it would be convenient for first-person shooters. Maybe also third-person shooters, depending on the camera's FOV.



I'm also not convinced that your vision tops out at 25Hz in the central 'high detail' portion, as you can still notice a difference in detail between 25Hz and 120Hz images (panning text, for example): what is unreadable in one becomes readable in the other, and as far as I know, you only read using your central 'sharp' vision.

There are two more things...

Yes, you read using central vision. However, if you're panning text across the full screen, your peripheral vision will probably detect the lower rate and "tip you off" about the low frame rate. Try panning text within a small window, look at it, and see if you get the same results.


Also, just because you "see" at around 25Hz, it doesn't mean a 25fps source will seem like "real life" to you. First, your vision could be 23fps or could be 28; it depends on a lot of things, as I said, and even on your concentration level. That alone could be enough to perceive the same frame twice.
And even if your vision were "fixed" at 25Hz, it doesn't mean it would "vsync" with the monitor.
Even a small offset between your vision and the monitor's refresh would result in your eyes "sampling" the same frame from time to time, giving you the impression of stuttering.
This is why 30fps is considered the "minimum" for playing games: the chances of your vision sampling the same frame are reduced considerably.

Going a bit further, the only way to make sure you'll never "sample" the same frame is to follow the Nyquist theorem and have the monitor deliver twice the rate your central vision could ever resolve: 60fps vsynced to a 60Hz refresh rate.

This is why 30 and 60fps have pretty much been standardized for console and PC gaming.
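
Whether or not the eye really works as a fixed-rate sampler (the replies below dispute exactly that), the aliasing arithmetic described above can be put into a toy simulation. The 28 Hz "sampler" rate and the duplicates() helper are purely illustrative assumptions:

# Toy simulation of the fixed-rate-sampler model described above -- NOT a claim
# about how vision actually works, just the aliasing arithmetic of that model.

def duplicates(source_fps: float, sampler_hz: float, seconds: float = 10.0) -> int:
    """Count how often a fixed-rate sampler sees the same source frame twice in a row."""
    dup = 0
    prev_frame = None
    for i in range(int(seconds * sampler_hz)):
        t = i / sampler_hz                   # sample time
        frame = int(t * source_fps)          # which source frame is on screen at t
        if frame == prev_frame:
            dup += 1
        prev_frame = frame
    return dup

# A hypothetical 28 Hz sampler occasionally lands on the same 25 fps frame twice
# in a row; against 30 fps and 60 fps sources it never does, since new frames
# arrive at least as fast as it samples.
for src in (25, 30, 60):
    print(f"{src} fps source: {duplicates(src, 28.0)} duplicated samples in 10 s")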




Yes. You can easily notice the difference between 60 fps and 120 fps at the center of your vision if you are watching discrete computer-generated image sequences (no motion blur). The difference is very clear. Simple test case: move your mouse cursor in a small circle at the center of your desktop.

What you're seeing isn't the perception of the monitor's refresh rate, but rather the ghosting effect from a slow white-to-black response time. 120Hz monitors are generally made for stereo 3D, which means they absolutely need to have excellent black-to-white response times, or all you'd see would be a blurred image with the glasses on.
The same doesn't happen with standard 60Hz monitors, where a slow white-to-black response time is a lot more common than some might think (most manufacturers claim "false" response times of 5ms, as those correspond to, for example, light-grey to dark-grey transitions).
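
To put rough numbers on that (a back-of-the-envelope model of my own: a first-order exponential pixel response with assumed 5 ms and 20 ms 10%-90% response times, not measurements of any real panel), you can estimate how much of the previous frame is still visible when the next refresh arrives:

# Simplified ghosting estimate: model a pixel transition as a first-order
# exponential and see what fraction of the old value remains at the next refresh.
import math

def residual(t_ms: float, response_10_90_ms: float) -> float:
    """Fraction of the old pixel value left t_ms after the transition starts,
    for a panel whose 10%-90% response time is response_10_90_ms."""
    tau = response_10_90_ms / math.log(9)    # 10%->90% of an exponential takes tau*ln(9)
    return math.exp(-t_ms / tau)

for label, resp_ms in (("fast 3D-ready panel (~5 ms)", 5.0),
                       ("slow white-to-black panel (~20 ms)", 20.0)):  # assumed figures
    for frame_ms in (16.7, 8.3):             # 60 Hz and 120 Hz frame periods
        print(f"{label}: {residual(frame_ms, resp_ms):4.0%} of the old frame left "
              f"after {frame_ms} ms")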
 
A good LCD blur test is just about any dark, contrasty action game, like FEAR for example. Or maybe just find an HD YouTube video. The LCD frequently has to go from black to white, and you'll see its weaknesses quickly. Whether or not any blur will bother you is personal preference, of course.
 
Also, just because you "see" at around 25Hz, it doesn't mean a 25fps source will seem like "real life" to you. First, your vision could be 23fps or could be 28; it depends on a lot of things, as I said, and even on your concentration level. That alone could be enough to perceive the same frame twice.
And even if your vision were "fixed" at 25Hz, it doesn't mean it would "vsync" with the monitor.
Even a small offset between your vision and the monitor's refresh would result in your eyes "sampling" the same frame from time to time, giving you the impression of stuttering.
This is why 30fps is considered the "minimum" for playing games: the chances of your vision sampling the same frame are reduced considerably.

You are very confused and I'll just leave it at that. Well I'll at least say that your eyes don't have a discrete sampling rate and you don't see in FPS. There is no way your eyes would not "vsync" with your monitor lol.

As for what framerate is enough, this is a loaded question. It all depends on what you're viewing and whether or not proper motion blur is applied. But I will say that 30fps is just fine for almost all types of games if there is perfect full scene motion blur.
 
You are very confused and I'll just leave it at that. Well I'll at least say that your eyes don't have a discrete sampling rate and you don't see in FPS. There is no way your eyes would not "vsync" with your monitor lol.

As for what framerate is enough, this is a loaded question. It all depends on what you're viewing and whether or not proper motion blur is applied. But I will say that 30fps is just fine for almost all types of games if there is perfect full scene motion blur.

Indeed, 24 fps is fine for theater movies. On the opposite side of the coin, however, military aviation tests have shown humans to be able to pick out a dot that flashes on screen for only 1/220th of a second. Granted, that is a black dot on a white background.

Really, the only answer to the question is that eyes aren't video cards or cameras; you can't rate them the same way. 30 fps would be good enough if the motion blur effects were realistic enough. The reason people think they need 60 fps or 120 fps is that real-time graphics aren't good enough for your brain to confuse them with reality, so your brain decides it's fake and notices details you normally wouldn't while trying to pick out the differences between frames.
 
Actually, cameras are much closer to how the eyes "work" -- both capture a continuous stream of light exposure with natural temporal coherence. The difference is that the camera "chops" this continuous exposure up and stores it in linked frames. In the case of computer-generated images, this temporal coherence simply doesn't exist as a precondition and must be artificially added (a.k.a. motion blur) to mimic the natural phenomenon, with variable success depending on the method and the field of application.
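
A sketch of what "artificially adding" that exposure usually amounts to in practice (temporal supersampling is one common approach; the 1D toy scene, the 16 sub-samples and all the other numbers here are made up for illustration):

# Fake a camera-style exposure by averaging temporal sub-samples of a moving dot.
import numpy as np

WIDTH = 64                      # 1D "scanline" resolution
FPS = 25
SUBSAMPLES = 16                 # shutter sub-positions accumulated per frame

def render(x_pos: float) -> np.ndarray:
    """Render a one-pixel bright dot at x_pos on the scanline."""
    img = np.zeros(WIDTH)
    img[int(x_pos) % WIDTH] = 1.0
    return img

def frame(index: int, speed_px_s: float, motion_blur: bool) -> np.ndarray:
    t0 = index / FPS
    if not motion_blur:
        return render(t0 * speed_px_s)               # single instantaneous sample
    acc = np.zeros(WIDTH)                             # accumulate across the exposure
    for s in range(SUBSAMPLES):
        t = t0 + (s / SUBSAMPLES) / FPS
        acc += render(t * speed_px_s)
    return acc / SUBSAMPLES

sharp = frame(0, speed_px_s=200.0, motion_blur=False)
blurred = frame(0, speed_px_s=200.0, motion_blur=True)
print("lit pixels without blur:", int((sharp > 0).sum()))    # 1: a discrete dot
print("lit pixels with blur:   ", int((blurred > 0).sum()))  # 8: a smeared streak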
 
Indeed, 24 fps is fine for theater movies. On the opposite side of the coin, however, military aviation tests have shown humans to be able to pick out a dot that flashes on screen for only 1/220th of a second. Granted, that is a black dot on a white background.
The trick is that they still can't see "220 pictures a second". Human brains don't see "frames" or "pictures" per second; they see a constant flow of information. In normal daylight the threshold is around 70-75Hz; anything over that and your brain starts to mix things up, more and more the faster they change in front of your eyes. A white background with a dot showing up for 1/220th of a second? Sure, no problem. But put in 3 dot pictures with a couple of white pictures in between and you probably won't be able to say whether there were 1, 2 or 3 dot pictures in there.

So in roughly daylight conditions, around 70-75Hz should be optimal; anything over that and your brain starts to kick in its own "motion blur", figuratively speaking. The darker it gets, the lower the threshold should be. I don't know how much higher it gets when the lighting is brighter than daylight, though, if at all.

(disclaimer: of course every person is an individual and might have different thresholds)
 
You are very confused and I'll just leave it at that. Well I'll at least say that your eyes don't have a discrete sampling rate and you don't see in FPS. There is no way your eyes would not "vsync" with your monitor lol.

First you say I'm "very confused" and then you proceed with awfully vague statements, ending with a "lol".
Kudos for that participation.

Maybe you should read a couple of chapters in some books on human optics, or you could just browse the relevant chapters on the University of Utah's Webvision site.


BTW, what's wrong with this forum that whenever someone tries to explain something from outside the hardware/software domain, there's always somebody countering the explanation with a
"You are very confused and I'll just leave it at that. (...) lol"?

What's the purpose of this?
Does saying "you're wrong, I'm right, I won't bother telling you why, lol" make people feel better about themselves?


Whatever, let's just stay on topic then.
 
Central vision is indeed very slow, as evolution has adapted it for precision rather than speed. The ~25fps figure is actually roughly correct for central vision.
That is blatantly wrong. If you are basing this "25FPS" figure on the measurement of the optic nerve impulse, which lasts for 1/25th of a second, then your whole point is baseless, because that measurement alone doesn't accurately represent the complexities of human visual perception.

The short answer to the famous question "What is the frame rate of the human eye?" is very simple: it has not yet been measured, and it varies with the circumstances around you. In other words, the eye adapts its "frame rate" depending on your lifestyle and your daily activities.

Pilots have been known to distinguish images that are flashed for 1/220th of a second; consequently, they are able to interpret a very high count of fps.

When adrenaline is flowing through the blood (usually in dangerous situations), it boosts the general capabilities of the human body (brain, muscles, etc.), the eye included, allowing it to detect a much higher FPS than normal.

In fact, it is very naive to think that a biological device like the eye operates at maximum capacity 24/7, when in practice no other biological system does that. Human organs adapt and react to their environment and adjust their capacities accordingly.

The eye is a very complex organ. Oversimplifying its capabilities into a single number on a chart is very naive; its true potential has not been fully uncovered yet, and we are still finding out more about it to this day.
 
Pilots have been known to distinguish images that are flashed for 1/220th of a second; consequently, they are able to interpret a very high count of fps.
That test is completely flawed as an argument: they had 219 white frames and 1 frame with an image (though IIRC it was 1/200, not 220, but that's irrelevant). Go and put 220 different images there instead and it's all one big blur.
 
That test is completely flawed as an argument: they had 219 white frames and 1 frame with an image (though IIRC it was 1/200, not 220, but that's irrelevant). Go and put 220 different images there instead and it's all one big blur.
True. However, I didn't include this test to imply that the eye can distinguish a frame rate of 220 fps; it is just more evidence pointing towards the eye's capability to see high fps, be it 60, 100, 120, etc.
 
I'm not a game dev or even a programmer (just bash and perl/python scripts for now), but I am in IT and fairly advanced, and I have been heavily into PC gaming since 1991. I used to play CounterStrike and Call of Duty 1 for money, even. I mostly lurk here to learn from you guys and generally only post to ask questions and to offer a demanding old-school gamer's thoughts to devs who may not necessarily care about our opinion in this mass-market age of powerful consoles and focus groups.

I didn't switch to LCD for games until the 120Hz models came out and used a CRT until 2010, because the difference between 60fps and 100+ is HUGE (which is what you need in CS, COD, and other games to take full advantage of how the engine works over the network). I'm not even touching 30, which is much, much worse. Keep in mind the latency of a 120Hz frame is about 8ms instead of 16 or 32; we can tell. I tried the fastest 60Hz LCDs and they all gave me a headache, and the blurring would hurt my eyes within a few minutes so I would have to stop. When I tried the first 120Hz model, it was like the blur from just opening your eyes in the morning suddenly being lifted.
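
For reference, the frame-time figures quoted there fall straight out of the refresh rates (trivial arithmetic, included only because the 8/16/32 ms numbers keep coming up in this thread):

# Frame period for the refresh/frame rates mentioned in this thread.
for hz in (30, 60, 120, 160, 250):
    print(f"{hz:3d} Hz/fps -> {1000.0 / hz:5.1f} ms per frame")
# 30 -> 33.3 ms, 60 -> 16.7 ms, 120 -> 8.3 ms, 160 -> 6.3 ms, 250 -> 4.0 ms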

I had a Sony FW900 CRT that could go up to 160Hz, and for kicks one day I enabled vsync (which you don't need at high refresh and frame rates, as the tearing becomes almost impossible to see) and tried every refresh rate from 60 to 160, in 10Hz increments. I clearly saw and felt a difference up until about 120-140Hz/fps (and yes, I had the .cfg file customized to make sure the frame rate never dropped below the refresh rate).

Why is it that every time people like me, who aren't pulling this crap out of their asses, say that we can tell the difference, there is an immediate backlash basically calling us crazy or liars?
 