The higher the resolution, the higher the framerate we need?

Discussion in 'Architecture and Products' started by MistaPi, Dec 18, 2002.

  1. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,360
    Likes Received:
    1,377
    Though I have agreed with pretty much all you've said thus far, I feel you aren't quite clear here. Nor does it fit my personal experience.

    We tend to lock on to edges, or more accurately, instances of high local contrast. Now, when we move, what we use to judge that movement (say an angular movement when rotating) is the movement of whatever high-contrast features we have in our field of view. Moving high contrast features are actually quite close to the worst case you describe above, and the problem simply does not go away at higher framerates, it just gets progressively smaller.

    Now, you used the phrase "when the motion starts to feel 'fluid'", which is a broad and very subjective description, since it marks the interval at which these problems are no longer perceived as intrusive. This is very strongly dependent on speed of movement and demands on accuracy. It also depends on training, both of your perception and of your general gameplay skill. For instance, if you have crummy precision to start with, you are less likely to find the limitations in precision imposed by your framerate intrusive.

    To my eyes, CRTs simply aren't capable of a fully "fluid" experience.
    I would submit that good-enough-not-to-be-a-major-bother-as-long-as-you-move-like-a-slug isn't much of a worthy goal either. What I really find a bit frightening is that the standards that CRTs have reached have been greatly lowered by LCDs. What I absolutely don't want to see is LCD (or OLED or whatever) technology get stuck on an arbitrary value like 60 Hz as good enough, just because that particular number gets referred to often.

    Entropy
     
  2. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,779
    Likes Received:
    1,816
    Higher res shouldn't make objects smaller in 3D games, just show them with, well, higher res. Unless by "smaller" you mean one-pixel "enemies" at 8x6 become three more-accurate pixels at 16x12, thus slightly smaller in terms of ratio--in which case the enemy is so far away it shouldn't matter.
     
  3. Nagorak

    Regular

    Joined:
    Jun 20, 2002
    Messages:
    854
    Likes Received:
    0
    I understand what you're saying, but I've never really noticed things on my monitor being less than fluid. Fluid, IMO, is when I turn the mouse and end up looking where I intended (rather than stuttering past at low FPS). Yeah, if you look closely you can see things shifting across the screen, but it's one of those things you just don't notice unless you're looking for it.
     
  4. demalion

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,024
    Likes Received:
    1
    Location:
    CT
    I just think the number required to achieve that perception is lower than the number you propose calculating.

    I think this means we agree?

    I am making a distinction between an upper limit theoretically and what is actually required to be achieved for the goal as far as perception, as I don't think they are required to be the same thing.

    Why does frame rate have to correlate to screen resolution? To clarify this comment, I think we need to clarify the term flicker. When discussing refresh rate, I'm talking here about flicker as a matter of unsteadiness in the display device in question, and in a world of perfect LCDs (very low transition times) it would not occur.

    The kind you seem to be talking about here would, I think, get worse at lower resolutions, which would negate your point about correlating framerate to resolution to eliminate perceived change: you couldn't eliminate the perception of flicker in an animated scene at a low enough resolution, because the picture-element transitions would be too coarse and there would be visible "jumping". This is related to how big the display's "pixel" appears to the viewer.

    Objects don't become smaller at higher resolution. They can be drawn more accurately, which might cause them to appear smaller in terms of screen space taken, but that would be because errors magnified them at the lower resolution, wouldn't it? Depending on the object shape and error of the rendering technique (i.e., no anti-aliasing), I think they might even be perceptionally smaller at lower resolutions...

    I can understand that idea, and it seems to agree with what I just said...?

    My talk about that (second paragraph) was related to the refresh rate argument, my talk about pixel changes between frames (first paragraph) was about your framerate comments. Using the word flicker for both lends itself to confusion, I see now. :-? Thanks for fixing that later... ;)

    Not really, I'm trying to say that at higher resolution, flicker (related to refresh rate deficiency) can be more noticeable because there are potentially more transitions to exhibit it. At lower resolution, the transition edge where flicker will occur is less of the area of each pixel, so the "worst case" limit should be less (on a sharp display that doesn't "auto" gradate like television). Drawing a screen half black and half white at different resolutions with the same refresh rate should not change the picture (unless the monitor changes behavior for some reason) or amount of flicker. But the higher resolution has both more opportunity to flicker (if drawing thinner lines) and an opportunity to reduce flicker (by gradating transitions), assuming both resolutions are displaying equally sharply and at the same refresh rate. TV just does some of that gradation naturally by its deficiency in sharpness (though interlacing introduces other opportunities for flicker) and by the nature of most content.

    Yes, but my example with TV was an illustration of the color gradation at work, not intended to be directly correlated to resolution. Just lowering resolution won't have that effect, just limit the worst case amount possible (see above).

    Your statement about "constant picture element size" above seems to agree with me, but I am stating everything out for clarity.

    Yes, it will help to use different terms, heh....but I thought the other flicker (refresh rate) was more like a stroboscope effect? Ack, my head is spinning! ;)

    Yes, and my comments addressing that are about the threshold for our perception not necessarily requiring that your criteria be achieved.

    Perhaps it is just the use of the word "flicker" too broadly (by both of us, or just me? Brings to mind the classic Family Circus "Not me" cartoons... :p ) that confused some of this?
     
  5. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,779
    Likes Received:
    1,816
    Entropy, how are CRTs not capable of delivering fluid framerates? They're not the limiting factor in most cases, but people with monstrous cards like the 9700 Pro seem to enjoy their "fluid" framerates.

    LCDs aren't being hobbled by an "arbitrary" 60Hz refresh; on the contrary, most LCDs can't show more than 60Hz even if they "refreshed" at 75Hz or above, simply because their pixel response is so slow. The fastest LCD on the market, a Hitachi (reviewed at GamePC), only claims a response time of 16ms. That's 62.5 refreshes a second, max. A more common "excellent" response time is 25ms rise+fall, which means 40fps max.
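    The arithmetic here is just the reciprocal of the quoted response time. A minimal sketch (illustrative only; the function name is mine, not anything from the panel specs):

```python
def max_distinct_frames(response_time_ms: float) -> float:
    """Upper bound on distinct frames per second an LCD can fully
    resolve, assuming each frame needs one complete pixel transition."""
    return 1000.0 / response_time_ms

print(max_distinct_frames(16))  # 62.5 -- the 16ms Hitachi panel
print(max_distinct_frames(25))  # 40.0 -- a common 25ms rise+fall spec
```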

    Given that a CRT can refresh at 120Hz and above, I can't envision it as a limiting factor in fluid gameplay. Then again, I've never had a video card fast enough for me to consider my monitor slow. In fact, I only recently got an nForce with onboard GF2MX, and that only hits 100Hz in Half-Life mods (though I don't think my monitor is failing to display fluid framerates).
     
  6. Dio

    Dio
    Veteran

    Joined:
    Jul 1, 2002
    Messages:
    1,758
    Likes Received:
    8
    Location:
    UK
    That's exactly where I saw it too - the diving at the Barcelona olympics.

    That's a far better term for describing this problem, if you don't mind I'll steal that for when I have to bring this up (about once every 2-3 years :) )
     
  7. Dio

    Dio
    Veteran

    Joined:
    Jul 1, 2002
    Messages:
    1,758
    Likes Received:
    8
    Location:
    UK
    I've found that even my 'slow' 35ms response LCD feels fine to me - I just haven't seen the ghosting at all since the first week. Which I find strange, because I'm usually dead picky about my monitors.

    Now, I qualify these statements because I'm a wargamer, not a Quake-head; I can't even run with vsync off, because the tearing annoys me too much. Which isn't to say that I don't enjoy Q3 / UT, but really I wouldn't want to play with anyone who cares that much; they'd just clean the floor with me :)

    For me, I've never been able to tell any significant difference between 60Hz frame update rates and 120Hz frame update rates (except that at the higher rate, it's more likely to stutter because it's REALLY hard to ensure that you render a whole frame in 8.3ms every single frame). The only thing that I care about is the flickering. So the LCD is a plus point for me, because there's exactly zero flicker.
     
  8. Dio

    Dio
    Veteran

    Joined:
    Jul 1, 2002
    Messages:
    1,758
    Likes Received:
    8
    Location:
    UK
    It doesn't quite work like that - these figures are somewhat misleading. LCD crystals turn at a rate based on the field, and only a black-to-white transition actually occurs at the quoted response time. Smaller changes occur more slowly. IIRC the worst-case transition time is usually considered to be about double the quoted response time.

    Also IIRC, the Hitachi panel has more linear response times because it is able to 'overvoltage' to initiate the turn more quickly then settle to the 'right' value... I can't remember where I heard that, and I might be quite wrong.
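    If that rule of thumb holds (the worst-case gray-to-gray transition taking roughly double the quoted black-to-white figure), the effective ceiling drops accordingly. A hedged sketch extending the response-time arithmetic (the 2.0 factor is just the IIRC estimate above, not a measured value):

```python
def worst_case_fps(quoted_response_ms: float, penalty: float = 2.0) -> float:
    """Effective max fps if the slowest (gray-to-gray) transition takes
    roughly `penalty` times the quoted black-to-white response time."""
    return 1000.0 / (quoted_response_ms * penalty)

print(worst_case_fps(25))  # 20.0 -- a quoted 25ms panel, worst case
print(worst_case_fps(16))  # 31.25 -- the 16ms panel, worst case
```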
     
  9. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,360
    Likes Received:
    1,377
    I thought that was quite clearly stated. :)
    We've been through this before, and the problem is general.

    Just because most people think NTSC TV is fine, does that mean that it's where we always want to be? Just because lots of people think 2 Mpix digicams are great, should I stop shooting MF and LF film? Just because most people think motion picture is smooth, does that mean that 24 mechanical fps jumping around on a big screen should be the limit for ever? Just because lots of people find AM radio to be OK should we cease to seek improvements?

    Personally, I have no problem whatsoever telling the difference between a solid 160 fps and a solid 100 fps, and 160 is clearly better. Almost all players at my level in my game of choice have the same ability. When I played through Dungeon Siege, however, I used settings that gave me <20 fps, since the most critical interactivity was in managing my inventory, and I mostly spent my time enjoying the pretty spectacle. YMMV.

    But there are clear reasons why 60 fps is just an arbitrary number and no holy grail, as Basic has tried to explain here, and other people in other threads. If there even is such a number, it lies higher than the abilities of normal raster terminals, which was why I suggested using vector terminals if you really wanted to examine the issue.

    Entropy
     
  10. Dio

    Dio
    Veteran

    Joined:
    Jul 1, 2002
    Messages:
    1,758
    Likes Received:
    8
    Location:
    UK
    Most games exhibit marginally different behaviour at different framerates (e.g. the Q3 125-fps resonance), usually because of the design of current physics engines. The two different approaches are still both detectable: fixed-dT physics will feel best at some multiple of the fixed rate, and variable-dT will (generally) become more accurate as the framerate rises.
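    The two approaches can be sketched roughly like this (illustrative Python; `integrate` is a stand-in for a real physics step, and the 1/125 quantum is only a nod to the Q3 resonance mentioned above):

```python
FIXED_DT = 1.0 / 125.0  # hypothetical fixed physics quantum (cf. Q3's 125 fps)

def integrate(state, dt):
    # Stand-in physics step: advance position by velocity over dt.
    pos, vel = state
    return (pos + vel * dt, vel)

# Variable-dT: step by the last frame's duration; accuracy (and feel)
# changes with the framerate itself.
def variable_dt_frame(state, frame_time):
    return integrate(state, frame_time)

# Fixed-dT: bank real time in an accumulator and step in fixed quanta;
# feels best when the display rate is a multiple of 1 / FIXED_DT.
def fixed_dt_frame(state, frame_time, accumulator):
    accumulator += frame_time
    while accumulator >= FIXED_DT:
        state = integrate(state, FIXED_DT)
        accumulator -= FIXED_DT
    return state, accumulator
```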

    I'm not arguing that you can't do it but I am wondering if at least partly it could be these minor differences... what do you think?
     
  11. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,360
    Likes Received:
    1,377
    You are right about the slight variations in physics behaviour, but I've tried this in multiple variations and multiple test subjects, and once the test subject gets bored, he just rotates on the spot and gives his verdict. This doesn't involve the physics at all.

    It is even more apparent in movement obviously, but I don't really feel it in the physics much, or even at all. Only in my own movement, or in being able to accurately judge direction and speed of opponents. Since I've used a solid fps/screen refresh of 125 Hz, I find lower framerates disturbing, and higher framerates nicer, but if I had used a screen (and host) that could support a solid 200 Hz, all my experimentation so far implies that this would simply become the new baseline, and my level of acceptance would be further raised.

    Basic did a good job of explaining the mechanics earlier in the thread. What an individual considers acceptable will vary. But the level where for instance a doubling in fps is "imperceptible" would seem to lie very high.

    Edit: For a quick and dirty test of this, simply drag your mouse pointer quickly over the screen. Did you see smooth movement, or multiple instances of the pointer? Which proves that whatever framerate your screen runs on right now doesn't suffice to simulate smooth movement.
    Bonus points if you realize that your mouse polling rate has to be set high (or at least be on USB) for the test to be valid. (And no, motion blur is no panacea. Let's avoid that discussion this time around, shall we?)

    Entropy

    PS: Of course, if you are a competitive player, you want the framerate to be at least at a level where it does no longer affect your level of gameplay adversely. Which would be somewhere between "acceptable" and "imperceptible".
     
  12. BoddoZerg

    Regular

    Joined:
    Jul 8, 2002
    Messages:
    481
    Likes Received:
    0
    You know what... in real life, time does not happen smoothly either. Quantum mechanics tells us that no movement is truly continuous; it always happens in a series of jumps, just like mouse movement on a 60 Hz monitor. The fastest that anything can happen (we can think of this as the "framerate" of life) is set by the Planck Time, which works out to about 10^43 frames per second.

    So all we need to do is build a monitor that can display 10^43 frames per second and a computer that can render that, and then no one will ever have to worry about flickering ever again!


    Impractical? Of course. But it's only a little bit less likely than someone making a monitor displaying 2000 fps. =/
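    The arithmetic behind that "framerate of life", as a quick check (the Planck time is roughly 5.39 × 10^-44 seconds, so its reciprocal lands near 10^43):

```python
PLANCK_TIME_S = 5.39e-44  # approximate Planck time, in seconds

frames_per_second = 1.0 / PLANCK_TIME_S
print(f"{frames_per_second:.2e}")  # on the order of 10^43 fps
```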

    Flicker and perceived "unsmoothness" are in the eye of the beholder. To an overwhelming extent, the perception of "low framerate" is a trained response. Just as you can train people to perceive light polarization, something most of us cannot do, computer gamers train themselves to detect the subtle errors in computer-rendered output that most people are not sensitive to, because those errors are unimportant in real life.

    Back when I had a really slow computer, anything above 40 fps felt as smooth as silk. Now, after half a year with a 2GHz, GeForce4-equipped rig, I go back and play the same games on the same computer at the same 40-50 fps that used to look great, and they feel crawlingly slow. Similarly, I used to have a really hard time seeing aliasing, but after getting used to 2x and 4x MSAA, any edge aliasing at resolutions lower than 1600*1200 pops out like a big ugly surprise.

    The ability to see things like flicker, aliasing, and low framerate is a trained response just like seeing polarized light. Your eyes have all the proper receptors to detect light polarization; it's just that most of us aren't trained to use them, so our brains simply ignore the "polarization" data that is coming in through our eyes. Similarly, everyone has the capability to see fine differences in framerate, very minor aliasing, and very fast flickering; but since we don't usually use those senses we just ignore it. High level FPS players and people who run 3d benchmarking sites are two categories of people who are forced to see such differences, so they tend to become exceptionally sensitive to those problems with their display output.

    The human eye is much more sensitive than we usually give it credit for. When I went from 30-fps gaming to 85-fps gaming, my brain retrained itself to detect finer framerate differences; where before my brain would have ignored 50-fps strobing and jerkiness, now it tells me that 50 fps is slower than the 85 Hz it's used to. Give people a display with 200 Hz refresh rate and they'll just retrain their brains to detect the difference between 200 Hz and 300 Hz; the perception of "low framerate" will still be there.

    Most likely, the perception of "low framerate" is going to be something we will live with for a long time - until display systems reach some unimaginably fast level (2000 fps?) that is finally beyond the perception of the human brain.
     
  13. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,779
    Likes Received:
    1,816
    I said "max." :) I've read full off to on is actually faster than gray to gray, hence the desire for feed forward.
     