Basic said:
The framerate I said above is needed if you have a white vertical line on a black background, moving fast horizontally.
With a high enough framerate it would be smeared into one gray area, but with a "too low" framerate you would see discrete lines. The limit I described is where the errors from not having an infinite fps are hidden by the blurring from spatial filtering.
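Concretely, here is a minimal sketch of that criterion, assuming the limit is reached when the line advances no more than one blur radius per frame; the screen width, crossing time, and blur radii below are made-up numbers for illustration, not measurements. Note that for the same physical motion, the speed in pixels per second (and hence the required fps) scales with the horizontal resolution.

```python
# Minimal sketch, assuming "strobing is hidden" once the per-frame displacement
# of the line is no larger than the display's spatial blur radius.

def required_fps(speed_px_per_s: float, blur_radius_px: float) -> float:
    """Framerate at which successive images of the moving line land close
    enough together for the spatial blur to merge them into one smear."""
    return speed_px_per_s / blur_radius_px

if __name__ == "__main__":
    speed = 1920 / 0.5            # line crosses a 1920 px wide screen in 0.5 s
    for blur in (0.5, 1.0, 2.0):  # hypothetical blur radii, in pixels
        print(f"blur {blur:3.1f} px -> {required_fps(speed, blur):6.0f} fps")
```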
I just think the number required to achieve that perception is lower than the number you propose calculating.
But of course it's theoretical, and not the framerate you would need in practical cases.
I think this means we agree?
It's just that this limit is the only one I can see that correlates directly to screen resolution.
I am making a distinction between a theoretical upper limit and what is actually required perceptually, as I don't think they have to be the same thing.
If you decide that a lower framerate is enough, then it's not directly correlated to screen resolution, and looking at the same object(*) at a different resolution shouldn't need a different framerate.
Why does frame rate have to correlate to screen resolution? To address this comment, I think we first need to pin down the term flicker. When discussing refresh rate, I'm talking about flicker as unsteadiness in the display device in question, and in a world of perfect LCDs (very low transition times) it would not occur.
The kind you seem to be talking about here would get worse at lower resolutions, which I think would negate your point about correlating framerate to resolution to eliminate perceived change. I.e., you couldn't eliminate the perception of flicker in an animated scene at a low enough resolution because the picture element transitions would be too coarse; there would be "jumping". This is related to how big the "pixel" of the display is to the viewer.
*) Here I realized that while I still stand by that statement, it's not the full story. The catch is that using a higher resolution often changes the way you play.
Higher resolution =>
=> enemies are visible at a greater distance =>
=> you're tracking smaller objects (measured in mm) =>
=> you need higher fps
Objects don't become smaller at higher resolution. They can be drawn more accurately, which might cause them to appear smaller in terms of screen space taken, but that would be because errors magnified them at the lower resolution, wouldn't it? Depending on the object's shape and the error of the rendering technique (e.g., no anti-aliasing), I think they might even be perceptually smaller at lower resolutions...
So while the higher resolution doesn't change anything directly, the changed playing style that comes with it might. This is kind of related to an (at first sight strange) comment I used to hear a lot. A lot of people didn't like to play at high resolution, because everything got so small and hard to hit. It took me some time before I realized that they'd started to shoot stuff from a greater distance without thinking about it.
I can understand that idea, and it seems to agree with what I just said...?
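A rough sketch of the geometry behind that anecdote, assuming a simple pinhole projection; the FOV, monitor width, and target size below are invented numbers. The point is that the on-screen size in millimetres is independent of resolution but shrinks with engagement distance:

```python
# Rough sketch: how big a target is on the monitor, in millimetres, under a
# simple pinhole projection. Resolution never enters the formula; distance does.

import math

def on_screen_width_mm(target_width_m: float, distance_m: float,
                       hfov_deg: float, screen_width_mm: float) -> float:
    """Physical width the target occupies on screen (small target, centered)."""
    half_fov = math.radians(hfov_deg) / 2
    fraction_of_screen = target_width_m / (2 * distance_m * math.tan(half_fov))
    return fraction_of_screen * screen_width_mm

if __name__ == "__main__":
    for d in (10, 30, 100):  # hypothetical engagement distances in metres
        w = on_screen_width_mm(0.5, d, hfov_deg=90, screen_width_mm=530)
        print(f"{d:>3} m -> {w:5.1f} mm wide on screen")
```

So the same enemy engaged at three times the distance is a third as many millimetres to track, regardless of how many pixels it covers.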
I'm not sure which way you're debating with the "pixel"/"picture element" comment.
My talk about that (second paragraph) was related to the refresh rate argument; my talk about pixel changes between frames (first paragraph) was about your framerate comments. Using the word flicker for both lends itself to confusion, I see now.
Thanks for fixing that later...
It seems as if the argument is that higher resolution flickers less. (Notice that "picture element" size is constant for a given monitor.)
Not really, I'm trying to say that at higher resolution, flicker (related to refresh rate deficiency) can be more noticeable because there are potentially more transitions to exhibit it. At lower resolution, the transition edge where flicker will occur is less of the area of each pixel, so the "worst case" limit should be less (on a sharp display that doesn't "auto" gradate like television).

Drawing a screen half black and half white at different resolutions with the same refresh rate should not change the picture (unless the monitor changes behavior for some reason) or amount of flicker. But the higher resolution has both more opportunity to flicker (if drawing thinner lines) and an opportunity to reduce flicker (by gradating transitions), assuming both resolutions are displaying equally sharply and at the same refresh rate.

TV just does some of that gradation naturally by its deficiency in sharpness (though interlacing introduces other opportunities for flicker) and by the nature of most content.
I definitely agree with that for interlaced monitors. On a TV with picture elements that are blurry enough, you don't see the 25/30 Hz flicker, just the 50/60 Hz.
Yes, but my example with TV was an illustration of the color gradation at work, not intended to be directly correlated to resolution. Just lowering resolution won't have that effect; it will just limit the worst-case amount possible (see above).
Your statement about "constant picture element size" above seems to agree with me, but I am spelling everything out for clarity.
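To spell out the gradation idea with numbers: a toy sketch, assuming a box-filter gradation over a one-pixel-wide transition and purely hypothetical edge positions. A hard black/white edge forces one pixel to make the full-amplitude jump; a gradated edge spreads the same transition over neighbouring pixels, so no single pixel changes by more than a fraction of it, whether the change happens between two frames or sits at the edge within one frame.

```python
# Toy sketch: one row of pixels with a vertical black/white edge, drawn either
# as a hard threshold or with box-filter gradation (pixel coverage).

def edge_row(edge_pos: float, width: int, gradate: bool) -> list[float]:
    """Pixel intensities; the region right of edge_pos (pixel units) is white."""
    row = []
    for x in range(width):
        if gradate:
            # fraction of pixel [x, x+1) covered by the white region
            row.append(min(max(x + 1 - edge_pos, 0.0), 1.0))
        else:
            row.append(1.0 if x + 0.5 >= edge_pos else 0.0)
    return row

if __name__ == "__main__":
    for gradate in (False, True):
        a = edge_row(4.25, 8, gradate)
        b = edge_row(4.75, 8, gradate)  # the edge half a pixel further along
        max_change = max(abs(p - q) for p, q in zip(a, b))
        print(f"gradated={gradate}: largest single-pixel change = {max_change:.2f}")
```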
For low framerates (where I would call the error a "stroboscope effect" rather than "flicker"),
Yes, it will help to use different terms, heh....but I thought the other flicker (refresh rate) was more like a stroboscope effect? Ack, my head is spinning!
I would consider it relevant in the calculation I made in the last post, but not so much when doing the "high res => different gaming" reasoning in this post.
Yes, and my comments addressing that are about the threshold of our perception not necessarily requiring that your criterion be met.
Perhaps it is just the use of the word "flicker" too broadly (by both of us, or just me? Brings to mind the classic Family Circus "Not me" cartoons...) that confused some of this?