The higher the resolution, the higher the framerate we need?

Framerate and resolution are not dependent on each other.

You need the same framerate (60fps) no matter the resolution.
 
Well, now I think minimum FPS is far more important than high average FPS. When I play BF1942 the FPS sometimes drops below 60. I just want the day to come when I can turn all the eye candy on at 1024x768 and the minimum FPS still stays above 80. Maybe I'm dreaming of that day. :D
 
Ingenu said:
Framerate and resolution are not dependent on each other.

You need the same framerate (60fps) no matter the resolution.

Just checking ;). So what this was probably referring to is the refresh rate on a CRT monitor. A higher resolution has more lines to scan, which more easily causes flicker?

About 60fps: that's a "standard" for a smooth framerate because it's "enough" for most of us?
 
Flicker has nothing to do with frame rate or resolution. If you have a refresh rate below 85 Hz you're going to see it flickering. For games 75 Hz is marginal, but for working on the desktop it is HELL. The lowest decent refresh rate I'd even consider using for the desktop is 85 Hz, and even then the higher the better. The reason you notice more flickering at higher resolutions is probably because your monitor's refresh rate at higher resolutions is lower.
 
Nagorak said:
Flicker has nothing to do with frame rate or resolution. If you have a refresh rate below 85 Hz you're going to see it flickering. For games 75 Hz is marginal, but for working on the desktop it is HELL. The lowest decent refresh rate I'd even consider using for the desktop is 85 Hz, and even then the higher the better. The reason you notice more flickering at higher resolutions is probably because your monitor's refresh rate at higher resolutions is lower.
No kidding.
That's why I mentioned flicker - it has to be what he is noticing.
 
Are you sure about this? I think the flickering at 60 Hz is much worse at 1600x1200 than at 800x600 (or is it just me?).
I read somewhere that we are more sensitive to flickering at higher resolutions because the dots are smaller.
 
60Hz is foul at any resolution! But I'm picky about my frame rates.

I'd say I don't notice a difference between 75Hz 8x6 and 75Hz 16x12 - 75Hz being the lowest tolerable rate for me. 85Hz is always sufficient, I find.
 
The only reason I can see for what you describe is if the higher resolution makes you move closer to the monitor (to see the smaller dots).

The peripheral view is more sensitive to flicker than the center of view. And if you move closer to the monitor, you'll have parts of the screen farther away from the center of view (measured in º, not cm).

So the rule would actually be:
If the monitor occupies a larger part of your field of view, then you'll need a higher screen refresh rate.
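
A rough back-of-the-envelope of that geometry (Python; the screen width and viewing distances are just made-up example numbers):

import math

def subtended_angle_deg(screen_width_cm, viewing_distance_cm):
    # Horizontal angle the screen covers in your field of view.
    return math.degrees(2 * math.atan(screen_width_cm / (2 * viewing_distance_cm)))

# A hypothetical ~36 cm wide tube, viewed from 70 cm and then from 45 cm:
print(subtended_angle_deg(36, 70))  # ~28.8 degrees
print(subtended_angle_deg(36, 45))  # ~43.6 degrees

Move closer and the screen covers a noticeably larger angle, so more of it ends up in the flicker-sensitive periphery.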

That was about screen refresh rate, not framerate.

For framerate the argument would be different. I guess you could say that nothing on the screen should move more than one pixel between two frames. Otherwise you'll see every moving object placed in several distinct positions (a stroboscope effect instead of motion blur). So if resolution goes up, you'd need higher framerates. That argument is however irrelevant since we are very far away from that ideal framerate at any resolution.
 
Basic said:
That argument is however irrelevant since we are very far away from that ideal framerate at any resolution.

heh, i get 400fps on quake2 at ultra low res :)
that is probably sufficient :p
 
A 360º turn in one second isn't fast in an FPS. For 320x240 with a 90º FOV, you need >2000 fps to keep anything from moving more than 1 pixel between two frames.

And then you need all of those 2000 fps to actually hit every pixel on screen. (Disabling vsync is not enough.)
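
A quick sketch of that calculation (Python; it assumes a standard perspective projection and a pure horizontal turn, where the fastest-moving pixels are at the screen edge):

import math

def fps_for_one_pixel_per_frame(width_px, fov_deg, turn_deg_per_s):
    # Perspective projection: x(theta) = (W/2) * tan(theta) / tan(FOV/2),
    # so during a turn the pixel speed peaks at the screen edge, where
    # dx/dtheta = (W/2) / tan(FOV/2) / cos^2(FOV/2).
    half_fov = math.radians(fov_deg / 2)
    px_per_rad_at_edge = (width_px / 2) / math.tan(half_fov) / math.cos(half_fov) ** 2
    # fps needed so nothing moves more than 1 pixel between two frames:
    return px_per_rad_at_edge * math.radians(turn_deg_per_s)

print(fps_for_one_pixel_per_frame(320, 90, 360))   # ~2011 fps
print(fps_for_one_pixel_per_frame(1600, 90, 360))  # ~10053 fps

The 320x240 case lands just above 2000 fps, and the requirement scales with the horizontal resolution (as Entropy notes further down).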
 
Umm... I don't think eliminating pixel changes between frames in that way matters directly. I think it successfully eliminates all "possibility" of perceiving flicker, up to the limits of such an (as yet imaginary) display device, but the threshold at which our perception can distinguish such things is far, far away from some of those proposed figures and completely unrelated to that goal (at least on a mathematical scale that places no limit in relation to our ability to perceive).

The increase in the perception of flicker is not due to the smaller pixel size, but to the greater impact of the existing flicker on the picture elements at sharp transitions in color (i.e., with bigger picture elements, more of each picture element doesn't exhibit flicker at a sharp transition). I.e., if you had really small pixels but used them with smooth transitions and "big" picture elements, flicker doesn't matter (witness TV). Note: "pixels" = smallest picture element of the display, "picture elements" = smallest fundamental element of what is being displayed on said display.
 
Basic said:
A 360º turn in one second isn't fast in an FPS. For 320x240 with a 90º FOV, you need >2000 fps to keep anything from moving more than 1 pixel between two frames.

And then you need all of those 2000 fps to actually hit every pixel on screen. (Disabling vsync is not enough.)

And of course, having higher resolution increases the fps requirement.
Demalion is right, of course, in that this is a calculated number which may not have much to do with what our perception requires. It is hard to test this with typical raster screens, though, since they have maximum refresh rates of around 200 Hz. If someone still has an E&S vector terminal, this could actually be tested with some sort of simple vector image. (Which would probably be a worst case.)

But we do know that current CRTs do not allow fast enough refresh rates to get rid of the artifacting. We need better than 200 Hz, at least. LCDs need not apply. :)

Entropy
 
If I understand this correctly, you have to double the framerate when you double the resolution to keep the same sense of smoothness?
Then why all the talk about 60fps if it's totally dependent on resolution?
 
MistaPi said:
If I understand this correctly, you have to double the framerate when you double the resolution to keep the same sense of smoothness?
Then why all the talk about 60fps if it's totally dependent on resolution?

The fixation on 60fps is due to the NTSC TV standard being 60Hz interlaced, so it's a number that feels natural to people who watch NTSC TV.
The reason NTSC TV uses 60Hz interlaced is that AC power in the USA is 60 Hz, and has nothing whatsoever to do with human perception.

Entropy
 
MistaPi said:
If I understand this correctly, you have to double the framerate when you double the resolution to keep the same sense of smoothness?
Then why all the talk about 60fps if it's totally dependent on resolution?
No, because for it to be smooth I don't think you need to move only one pixel per frame.
As pixels get smaller (higher res), a movement of two pixels can cover the same physical distance as a movement of one pixel at a lower res. Saying one is smooth and the other is not is a fallacy.
 
The framerate I gave above is what you'd need if you have a white vertical line on a black background, moving fast horizontally.
With a high enough framerate it would be smeared into one gray area, but with "too low" a framerate you would see discrete lines. The limit I gave is where the error from not having an infinite fps is hidden by the blurring from spatial filtering.

But of course it's theoretical, and not the framerate you would need in practical cases. It's just that this limit is the only one I can see that correlates directly to screen resolution. If you decide that a lower framerate is enough, then it's not directly correlated to screen resolution, and looking at the same object(*) at a different resolution shouldn't need a different framerate.

*) Here I realized that while I still stand by that statement, it's not the full story. The catch is that using a higher resolution often changes what you're playing.
Higher resolution =>
=> enemies are visible at further distances =>
=> you're tracking smaller objects (measured in mm on the screen) =>
=> you need higher fps

So while the higher resolution doesn't change anything directly, the changed playing style that comes with it might. This is kind of related to an (at first sight strange) comment I heard a lot earlier. A lot of people didn't like to play at high resolution, because everything got so small and hard to hit. It took me some time before I realized that they'd started to shoot at stuff from further away without thinking about it.
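
A rough illustration of that chain (Python; the target width, distance and FOV are made-up numbers, and the formula only approximates a target near the screen centre):

import math

def pixels_covered(target_width_m, distance_m, screen_width_px, fov_deg):
    # Approximate number of horizontal pixels a target covers on screen.
    half_fov = math.radians(fov_deg / 2)
    return screen_width_px * (target_width_m / distance_m) / (2 * math.tan(half_fov))

# A hypothetical 0.5 m wide target at 50 m, with a 90º FOV:
print(pixels_covered(0.5, 50, 640, 90))   # ~3.2 px at 640x480 - barely a smudge
print(pixels_covered(0.5, 50, 1600, 90))  # ~8.0 px at 1600x1200 - something you can aim at

So the distant enemy that is barely visible at 640x480 becomes a real target at 1600x1200, even though it is just as small in angular terms.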


I'm not sure which way you're arguing with the "pixel"/"picture element" comment. It seems as if the argument says that higher resolution flickers less. (Notice that the "picture element" size is constant for a monitor.) I definitely agree with that for interlaced monitors. On a TV with picture elements that are blurry enough, you don't see the 25/30Hz flicker, just the 50/60Hz.
For low framerates (where I would call the error a "stroboscope effect" rather than "flicker"), I would consider it relevant to the calculation I made in the last post, but not so much for the "high res => different gaming" reasoning in this post.

[Edit]
This post was mostly a reply to what demalion said above.

MistaPi:
What I was saying was: while there are ways to deduce a reason that higher res would need higher fps, the arguments aren't strong enough to be of much concern. Possibly with the exception of the "different gaming" argument in this post.

There are different kinds of errors in real-time gfx, and they disappear / get less annoying at different framerates. So it's natural that we can deduce very different "good framerates". One error is when you sense the time steps in the gfx (there was one frame, and there's another). This is removed at a fairly low fps (be it 25fps, 60fps or 90fps, all depending on the viewer). This is when the motion starts to feel "fluid", and it's what I would say is the most important fps.
Then you have the error that, whatever framerate you have, it's possible for a bright object to fly past the view at such a high speed that it leaves a trail of distinct objects. To remove this you could need a very high fps.
 
Basic said:
On a TV with picture elements that are blurry enough, you don't see the 25/30Hz flicker, just the 50/60Hz.
Actually, if you see something on standard TV captured with a short virtual open-shutter time, and therefore no motion blur, it's very noticeable and I find it quite disturbing.

They used to do it occasionally with sports events - by using a fast virtual shutter they got better replays, but at the cost of a 'strange' picture.

Increasingly nowadays they use high-frame-rate cameras and average the frames to get the motion blur back if they want high quality slow-motion replays - I haven't seen the effect in a while.

At the moment most of our rendering has infinite shutter speed. Therefore we can't get away from this kind of effect - and motion blur is one thing that really nobody has much of a solution for yet except massive oversampling.
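
A minimal sketch of that brute-force approach (Python; render_at_time is a placeholder for whatever actually renders a frame at a given simulation time, here returning a flat list of pixel values):

def blurred_frame(render_at_time, frame_start, exposure, subsamples):
    # Brute-force motion blur: render several sub-frames spread across the
    # frame's exposure interval and average them.
    accum = None
    for i in range(subsamples):
        t = frame_start + exposure * (i + 0.5) / subsamples
        img = render_at_time(t)
        if accum is None:
            accum = [0.0] * len(img)
        for j, p in enumerate(img):
            accum[j] += p
    return [p / subsamples for p in accum]

# Toy usage: a single bright dot sweeping across a 1D strip of 8 pixels.
def toy_render(t):
    img = [0.0] * 8
    img[int(t * 8) % 8] = 1.0
    return img

print(blurred_frame(toy_render, 0.0, 0.5, 16))  # the dot is smeared over half the strip

It's the same 'average the frames' trick the TV people use, just done on rendered sub-frames instead of camera frames - which is why it needs such massive oversampling.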
 
Dio:
Yes, I've also noticed and been irritated by that. It's most visible when watching swimming - water splashes look very unnatural.

But I would classify it as a "stroboscope error" rather than a "flicker error". My TV comment was about reducing the "flicker error" by blurring the lines of each half-image so that they actually cover the area where the other half-image's lines should be.
 