The higher the resolution, the higher the framerate we need?

Basic said:
There are different kinds of errors in real-time gfx, and they disappear/get less annoying at different framerates. So it's natural that we can deduce very different "good framerates". One error is when you sense the time steps in the gfx (there was one frame, and there's another). This is removed at comparatively low fps (be it 25 fps, 60 fps or 90 fps, all depending on the viewer). This is when the motion starts to feel "fluid", and what I would say is the most important fps.
Then you have the error that, whatever framerate you have, it's possible to have a bright object fly past the view at such a high speed that it leaves a trail of distinct objects. To remove this you could need a very high fps.

Though I have agreed with pretty much all you've said thus far, I feel you aren't quite clear here. Nor does it fit my personal experience.

We tend to lock on to edges, or more accurately, instances of high local contrast. Now, when we move, what we use to judge that movement (say an angular movement when rotating) is the movement of whatever high-contrast features we have in our field of view. Moving high-contrast features are actually quite close to the worst case you describe above, and the problem simply does not go away at higher framerates; it just gets progressively smaller.

Now, you used the phrase "when the motion starts to feel 'fluid'", which is a broad and very subjective description since it marks an interval when these problems are no longer perceived as intrusive. This is very strongly dependent on speed of movement and demands on accuracy. It also depends on training, both of your perception and of your general gameplay skill. For instance, if you have crummy precision to start with, you are less likely to find limitations in precision imposed by your framerate to be intrusive.

To my eyes, CRTs simply aren't capable of a fully "fluid" experience.
I would submit that good-enough-not-to-be-a-major-bother-as-long-as-you-move-like-a-slug isn't much of a worthy goal either. What I really find a bit frightening is that the standards that CRTs have reached have been greatly lowered by LCDs. What I absolutely don't want to see is LCD (or OLED or whatever) technology get stuck on an arbitrary value like 60 Hz as good enough, just because that particular number gets referred to often.

Entropy
 
Higher res shouldn't make objects smaller in 3D games, just show them with, well, higher res. Unless by "smaller" you mean one-pixel "enemies" at 800x600 become three more-accurate pixels at 1600x1200, thus slightly smaller in terms of ratio--in which case the enemy is so far away it shouldn't matter.
 
Entropy said:
To my eyes, CRTs simply aren't capable of a fully "fluid" experience.
I would submit that good-enough-not-to-be-a-major-bother-as-long-as-you-move-like-a-slug isn't much of a worthy goal either. What I really find a bit frightening is that the standards that CRTs have reached have been greatly lowered by LCDs. What I absolutely don't want to see is LCD (or OLED or whatever) technology get stuck on an arbitrary value like 60 Hz as good enough, just because that particular number gets referred to often.

Entropy

I understand what you're saying, but I've never really experienced that things on my monitor aren't fluid. Fluid, IMO, is where I turn the mouse and end up looking where I intend (rather than stuttering past at low FPS). Yeah, if you look closely you can see things shifting across the screen, but it's one of those things you just don't notice unless you're looking for it.
 
Basic said:
The framerate I mentioned above is what's needed if you have a white vertical line on a black background, moving fast horizontally.
With a high enough framerate it would be smeared into one gray area, but with "too low" a framerate, you would see discrete lines. The limit I mentioned is where the errors from not having an infinite fps are hidden by the blurring from spatial filtering.

I just think the number required to achieve that perception is lower than the number you propose calculating.

But of course it's theoretical, and not the framerate you would need in practical cases.

I think this means we agree?

It's just that this limit is the only one I can see that correlates directly to screen resolution.

I am making a distinction between a theoretical upper limit and what actually has to be achieved for the perceptual goal, as I don't think they are required to be the same thing.

If you decide that a lower framerate is enough, then it's not directly correlated to screen resolution, and looking at the same object(*) at a different resolution shouldn't need a different framerate.

Why does frame rate have to correlate to screen resolution? To address this comment, I think we need to clarify the term "flicker". When discussing refresh rate, I'm talking about flicker as a matter of unsteadiness in the display device in question, and in a world of perfect LCDs (very low transition times) it would not occur.

The kind you seem to be talking about here would, I think, get worse at lower resolutions, which would negate your point about correlating framerate to resolution to eliminate perceived change. That is, you couldn't eliminate the perception of flicker in an animated scene at a low enough resolution, because the picture-element transitions would be too coarse; there would be "jumping". This is related to how big the "pixel" of the display appears to the viewer.

*) Here I realized that while I still stand by that statement, it's not the full story. The catch is that using a higher resolution often changes what you're playing.
Higher resolution =>
=> enemies are visible at further distance =>
=> you're tracking smaller objects (measured in mm) =>
=> you need higher fps

Objects don't become smaller at higher resolution. They can be drawn more accurately, which might cause them to appear smaller in terms of screen space taken, but that would be because errors magnified them at the lower resolution, wouldn't it? Depending on the object shape and the error of the rendering technique (e.g., no anti-aliasing), I think they might even be perceptually smaller at lower resolutions...

So while the higher resolution doesn't change anything directly, the changed playing style that comes with it might. This is kind of related to an (at first sight strange) comment I heard a lot earlier. A lot of people didn't like to play at high resolution, because everything got so small and hard to hit. It took me some time before I realized that they'd started to shoot stuff from further away without thinking about it.

I can understand that idea, and it seems to agree with what I just said...?

I'm not sure which way you're debating with the "pixel"/"picture element" comment.

My talk about that (second paragraph) was related to the refresh rate argument; my talk about pixel changes between frames (first paragraph) was about your framerate comments. Using the word flicker for both lends itself to confusion, I see now. :-? Thanks for fixing that later... ;)

It seems as if the argument says that higher resolution flickers less. (Notice that "picture element" size is constant for a monitor.)

Not really; I'm trying to say that at higher resolution, flicker (related to refresh rate deficiency) can be more noticeable because there are potentially more transitions to exhibit it. At lower resolution, the transition edge where flicker will occur is less of the area of each pixel, so the "worst case" limit should be lower (on a sharp display that doesn't "auto" gradate like television). Drawing a screen half black and half white at different resolutions with the same refresh rate should not change the picture (unless the monitor changes behavior for some reason) or the amount of flicker. But the higher resolution has both more opportunity to flicker (if drawing thinner lines) and an opportunity to reduce flicker (by gradating transitions), assuming both resolutions are displaying equally sharply and at the same refresh rate. TV just does some of that gradation naturally through its deficiency in sharpness (though interlacing introduces other opportunities for flicker) and through the nature of most content.

I definitely agree with that for interlaced monitors. On a TV with picture elements that are blurry enough, you don't see the 25/30 Hz flicker, just the 50/60 Hz.

Yes, but my example with TV was an illustration of the color gradation at work, not intended to be directly correlated to resolution. Just lowering resolution won't have that effect; it will just limit the worst-case amount possible (see above).

Your statement about "constant picture element size" above seems to agree with me, but I am stating everything out for clarity.

For low framerates (where I would call the error a "stroboscope effect" rather than "flicker"),

Yes, it will help to use different terms, heh....but I thought the other flicker (refresh rate) was more like a stroboscope effect? Ack, my head is spinning! ;)

I would consider it relevant in the calculation I made in the last post. But not so much when doing the "high res => different gaming" reasoning in this post.

Yes, and my comments addressing that are about the threshold of our perception not necessarily requiring that your criteria be achieved.

Perhaps it is just the use of the word "flicker" too broadly (by both of us, or just me? Brings to mind the classic Family Circus "Not me" cartoons... :p ) that confused some of this?
 
Entropy, how are CRTs not capable of delivering fluid framerates? They're not the limiting factor in most cases, but people with monstrous cards like the 9700 Pro seem to enjoy their "fluid" framerates.

LCDs aren't being hobbled by an "arbitrary" 60Hz refresh; on the contrary, most LCDs can't show more than 60Hz even if they "refreshed" at 75Hz or above, simply because their pixel response is so slow. The fastest LCD on the market, a Hitachi (reviewed at GamePC) only claims a response time of 16ms. That's 62.5 refreshes a second, max. A more common "excellent" response time is 25ms rise+fall, which means 40fps max.
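(To make that arithmetic explicit, here is a minimal sketch in Python; the function name is just illustrative. It simply inverts the quoted rise+fall time to get the ceiling on distinct frames per second.)

```python
def max_distinct_fps(response_time_ms: float) -> float:
    """Upper bound on distinct frames per second implied by a quoted
    rise+fall response time: the panel can't complete a full transition
    any faster than that, regardless of the signal's refresh rate."""
    return 1000.0 / response_time_ms

print(max_distinct_fps(16))  # ~62.5 per second, the Hitachi figure above
print(max_distinct_fps(25))  # 40 per second, the "excellent" 25 ms panel
```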

Given that a CRT can refresh at 120Hz and above, I can't envision it as a limiting factor in fluid gameplay. Then again, I've never had a video card fast enough for me to consider my monitor slow. In fact, I only recently got an nForce with onboard GF2MX, and that only hits 100Hz in Half-Life mods (though I don't think my monitor is failing to display fluid framerates).
 
Basic said:
Dio:
Yes, I've also noticed and got irritated by that. It's most visible when watching swimming. Water splashes look very unnatural.

But I would classify it as a "stroboscope error", rather than a "flicker error".
That's exactly where I saw it too - the diving at the Barcelona Olympics.

That's a far better term for describing this problem; if you don't mind, I'll steal it for when I have to bring this up (about once every 2-3 years :) )
 
Entropy said:
What I really find a bit frightening is that the standards that CRTs have reached have been greatly lowered by LCDs. What I absolutely don't want to see is LCD (or OLED or whatever) technology get stuck on an arbitrary value like 60 Hz as good enough, just because that particular number gets referred to often.
I've found that even my 'slow' 35ms response LCD feels fine to me - I just haven't seen the ghosting at all since the first week. Which I find strange, because I'm usually dead picky about my monitors.

Now, I qualify these statements because I'm a wargamer, not a Quake-head; I can't even play with vsync off because the tearing annoys me too much. Which isn't to say that I don't enjoy Q3 / UT, but really I wouldn't want to play with anyone that cares that much; they'd just clean the floor with me :)

For me, I've never been able to tell any significant difference between 60Hz frame update rates and 120Hz frame update rates (except that at the higher rate, it's more likely to stutter because it's REALLY hard to ensure that you render a whole frame in 8.3ms every single frame). The only thing that I care about is the flickering. So the LCD is a plus point for me, because there's exactly zero flicker.
 
Pete said:
A more common "excellent" response time is 25ms rise+fall, which means 40fps max.
It doesn't quite work like that - these figures are somewhat misleading. LCD crystals turn at a rate based on the applied field, and only a black-to-white transition actually occurs at the quoted response time. Smaller changes occur more slowly. IIRC the worst-case transition time is usually considered to be about double the quoted response time.

Also IIRC, the Hitachi panel has more linear response times because it is able to 'overvoltage' to initiate the turn more quickly, then settle to the 'right' value... I can't remember where I heard that, and I might be quite wrong.
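(Continuing the earlier sketch, and taking that IIRC figure at face value, worst-case transitions at roughly double the quoted time would halve the ceiling; the factor of two here is only that rule of thumb, not a measured value.)

```python
def worst_case_fps(quoted_ms: float, worst_case_factor: float = 2.0) -> float:
    """Ceiling when worst-case transitions take ~2x the quoted response time
    (the IIRC rule of thumb above); a 25 ms panel drops from 40 to ~20 fps."""
    return 1000.0 / (quoted_ms * worst_case_factor)

print(worst_case_fps(25))  # ~20 per second in the worst case
```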
 
Entropy said:
Now, you used the phrase "when the motion starts to feel 'fluid'", which is a broad and very subjective description since it marks an interval where these problems are no longer perceived as intrusive. This is very strongly dependent on speed of movement and demands on accuracy. It also depends on training, both of your perception and of your general gameplay skill.

I thought that was quite clearly stated. :)
We've been through this before, and the problem is general.

Just because most people think NTSC TV is fine, does that mean that it's where we always want to be? Just because lots of people think 2 Mpix digicams are great, should I stop shooting MF and LF film? Just because most people think motion pictures are smooth, does that mean that 24 mechanical fps jumping around on a big screen should be the limit for ever? Just because lots of people find AM radio to be OK, should we cease to seek improvements?

Personally, I have no problem whatsoever with telling the difference between 160 solid fps and 100 solid fps, and 160 is clearly better. Most all players at my level in my game of choice have the same ability. When I played through Dungeon Siege however, I used settings that gave me <20 fps, since the most critical interactivity was in managing my inventory, and I mostly spent my time enjoying the pretty spectacle. YMMV.

But there are clear reasons why 60 fps is just an arbitrary number and no holy grail, as Basic has tried to explain here, and as other people have in other threads. If there even is such a number, it lies higher than the abilities of normal raster terminals, which was why I suggested using vector terminals if you really wanted to examine the issue.

Entropy
 
Entropy said:
Personally, I have no problem whatsoever with telling the difference between 160 solid fps and 100 solid fps, and 160 is clearly better.
Most games exhibit marginally different behaviour at different FPS' (e.g. the Q3 125-fps resonance) usually because of the design of current physics engines. The two different approaches are still both detectable - fixed-dT physics will feel best at some multiple of the fixed rate, and variable-dT will become (generally) more accurate as frame rate rises.

I'm not arguing that you can't do it but I am wondering if at least partly it could be these minor differences... what do you think?
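(For readers who haven't met the two approaches Dio names, here is a minimal sketch of the difference; it's generic game-loop code in Python, not any particular engine's implementation. The 8 ms step in the fixed-dT version just mirrors the Q3 125 Hz example.)

```python
import time

def variable_dt_loop(update, render, running):
    # Variable-dT: each physics step uses whatever the last frame took,
    # so simulation accuracy (and feel) shifts with the framerate.
    prev = time.perf_counter()
    while running():
        now = time.perf_counter()
        update(now - prev)              # dt varies frame to frame
        prev = now
        render()

def fixed_dt_loop(update, render, running, dt=1.0 / 125.0):
    # Fixed-dT: physics always advances in constant increments (8 ms here),
    # and the loop consumes whole steps as frame time accumulates; it tends
    # to feel best when the display rate is a multiple of the fixed rate.
    accumulator = 0.0
    prev = time.perf_counter()
    while running():
        now = time.perf_counter()
        accumulator += now - prev
        prev = now
        while accumulator >= dt:
            update(dt)                  # dt is always the same
            accumulator -= dt
        render()
```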
 
Dio said:
Entropy said:
Personally, I have no problem whatsoever with telling the difference between 160 solid fps and 100 solid fps, and 160 is clearly better.
Most games exhibit marginally different behaviour at different FPS' (e.g. the Q3 125-fps resonance) usually because of the design of current physics engines. The two different approaches are still both detectable - fixed-dT physics will feel best at some multiple of the fixed rate, and variable-dT will become (generally) more accurate as frame rate rises.

I'm not arguing that you can't do it but I am wondering if at least partly it could be these minor differences... what do you think?

You are right about the slight variations in physics behaviour, but I've tried this in multiple variations and with multiple test subjects, and once the test subject gets bored, he just rotates on the spot and gives his verdict. This doesn't involve the physics at all.

It is even more apparent in movement obviously, but I don't really feel it in the physics much, or even at all. Only in my own movement, or in being able to accurately judge the direction and speed of opponents. Since I've been using a solid fps/screen refresh of 125 Hz, I find lower framerates disturbing, and higher framerates nicer, but if I had used a screen (and host) that could support a solid 200 Hz, all my experimentation so far implies that this would simply have become the new baseline, and my level of acceptance would be further raised.

Basic did a good job of explaining the mechanics earlier in the thread. What an individual considers acceptable will vary. But the level where for instance a doubling in fps is "imperceptible" would seem to lie very high.

Edit: For a quick and dirty test of this, simply drag your mouse pointer quickly over the screen. Did you see smooth movement, or multiple instances of the pointer? If the latter, it proves that whatever framerate your screen runs at right now doesn't suffice to simulate smooth movement.
Bonus points if you realize that your mouse polling rate has to be set high (or at least on USB) for the test to be valid. (And no, motion blur is no panacea. Let's avoid that discussion this time around, shall we?)

Entropy

PS: Of course, if you are a competitive player, you want the framerate to be at least at a level where it does no longer affect your level of gameplay adversely. Which would be somewhere between "acceptable" and "imperceptible".
 
Basic said:
A 360º turn in one second isn't fast in an FPS. For 320x240 with a 90º FOV, you need >2000 fps to not have anything moving more than 1 pixel between two frames.

And then you need all of those 2000fps to actually hit every pixel on screen. (Disabling vSync is not enough.)
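(One way to arrive at that >2000 fps figure, assuming a standard perspective projection: the angularly narrowest pixel sits at the edge of the 90º FOV, and the turn must advance by no more than that angle per frame. A small Python sketch of the arithmetic, with illustrative names:)

```python
import math

def min_fps_for_one_pixel_step(width_px=320, hfov_deg=90.0, turn_deg_per_s=360.0):
    """Minimum fps so a constant turn never moves the view by more than the
    angular width of the narrowest on-screen pixel (the one at the FOV edge)."""
    half_fov = math.radians(hfov_deg / 2)
    # screen_x = tan(angle) / tan(half_fov) * (width_px / 2); differentiating
    # gives the angular width of one pixel at the screen edge:
    edge_pixel_angle = 2 * math.tan(half_fov) * math.cos(half_fov) ** 2 / width_px
    return math.radians(turn_deg_per_s) / edge_pixel_angle

print(min_fps_for_one_pixel_step())  # ~2011, i.e. the ">2000 fps" quoted above
```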

You know what... in real life, time does not happen smoothly either. Quantum mechanics tells us that no movement is truly continuous; it always happens in a series of jumps, just like mouse movement on a 60 Hz monitor. The fastest that anything can happen (we can think of this as the "framerate" of life) is one Planck time, which works out to roughly 10^43 frames per second.

So all we need to do is build a monitor that can display 10^43 frames per second and a computer that can render that, and then no one will ever have to worry about flickering ever again!


Impractical? Of course. But it's only a little bit less likely than someone making a monitor displaying 2000 fps. =/

Flicker and perceived "unsmoothness" are in the eye of the beholder. To an overwhelming extent, the perception of "low framerate" is a trained response. Just as you can train people to perceive light polarization, something which most of us cannot do, computer gamers train themselves to detect the subtle errors in computer-rendered output that most people are not sensitive to because they are unimportant in real life.

Back when I had a really slow computer, anything above 40 fps felt as smooth as silk. Now, after half a year with a 2GHz, GeForce4-equipped rig, I go back and play the same games on the same computer at the same 40-50 fps that used to look great, and they feel crawlingly slow. Similarly, I used to have a really hard time seeing aliasing, but after getting used to 2x and 4x MSAA, any edge aliasing at resolutions lower than 1600*1200 pops out like a big ugly surprise.

The ability to see things like flicker, aliasing, and low framerate is a trained response just like seeing polarized light. Your eyes have all the proper receptors to detect light polarization; it's just that most of us aren't trained to use them, so our brains simply ignore the "polarization" data that is coming in through our eyes. Similarly, everyone has the capability to see fine differences in framerate, very minor aliasing, and very fast flickering; but since we don't usually use those senses we just ignore it. High level FPS players and people who run 3d benchmarking sites are two categories of people who are forced to see such differences, so they tend to become exceptionally sensitive to those problems with their display output.

The human eye is much more sensitive than we usually give it credit for. When I went from 30-fps gaming to 85-fps gaming, my brain retrained itself to detect finer framerate differences; where before my brain would have ignored 50-fps strobing and jerkiness, now it tells me that 50 fps is slower than the 85 Hz it's used to. Give people a display with 200 Hz refresh rate and they'll just retrain their brains to detect the difference between 200 Hz and 300 Hz; the perception of "low framerate" will still be there.

Most likely, the perception of "low framerate" is going to be something we will live with for a long time - until display systems reach some unimaginably fast level (2000 fps?) that is finally beyond the perception of the human brain.
 
Dio said:
It doesn't quite work like that - these figures are somewhat misleading. LCD crystals turn at a rate based on the field, and only a black-to-white transition actually occurs at the quoted response time. Smaller changes occur more slowly. IIRC the worst-case transition time is usually considered to be about double the quoted response time.
I said "max." :) I've read full off to on is actually faster than gray to gray, hence the desire for feed forward.
 