Fixed framerate: 60 fps is enough

60 Hz max and the best IQ that can be done at that?

  • Just no
  • 30 Hz minimum and higher IQ would be better
  • I don't care

Total voters: 226
Chalnoth said:
No, definitely not. The 1/8th of a second is not the time it takes the eye to move and refocus. It is the time (approximately: it varies depending upon brightness and color) that it takes one rod to "expose" and send its signal to the brain.
Not true. The latency of photoreceptor cells is in the order of a few hundred microseconds. Earliest cortical responses for visual stimuli are detected at around 40 milliseconds after presentation. And this is after the visual information has traveled from the retina to the thalamus and from there to the visual cortex.

(note: rods dominate our vision for anything we are focusing on).
No, it's the cones as you yourself mention below.
 
I still think you are underestimating the visual system, Chalnoth. While it may be true that it takes 1/8th of a second to process something you see, it is a steady stream of neurotransmitters that are being released from your rods and cones (and yes, I know that rods are at the periphery and handle black/white/brightness, while the fovea is composed mainly of cones with a few rods interspersed).

So basically the photoreceptors in your retina (underneath all of the vascular structures that feed them) respond to stimulus from light. Once a photon hits those cells, they release neurotransmitters. As these cells keep getting hit by photons, they continually let out neurotransmitters in a steady stream. Essentially any new stimulus to these cells will elicit a response.

So no matter whether a monitor is refreshing itself 85 times a second, or even 240 times a second, there is going to be a change in color and intensity for any pixel affected by movement. Our eyes can pick up those differences very quickly, and that stream of information is relayed to the brain via nerves and the neurotransmitters connecting them all. All of the light that you see is essentially transformed into raw "data" on its way to your brain, where you can identify that data and make something useful out of it. We essentially live in an analog world where information is constantly streaming towards us in the form of light (not as discrete packets or frames), so we are used to seeing an uninterrupted stream of information. Basically, the more fps you can display on a screen, the more our brain will tell us that what we are seeing is real.

As for motion blur, are you saying that if you are playing something like UT and getting 120 fps, and a highly detailed model of a sparrow flies from the left-hand side of the screen to the right in a quarter of a second, you will see 30 distinct frames of that model in that time? I don't think so. More likely you will say, "WTF was that blurry thing that shot across the screen?" See? Motion blur without having to use accumulation-buffer effects.
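To put rough numbers on that sparrow example (a back-of-envelope sketch; the 0.25 s crossing time is from the post, the 1024-pixel screen width is my own assumption):

```python
# Rough sketch: how many discrete positions the sparrow is drawn in, and how far
# apart they are, for a 0.25 s crossing of an assumed 1024-pixel-wide screen.

def crossing_stats(fps, crossing_time_s=0.25, screen_width_px=1024):
    frames_shown = fps * crossing_time_s             # distinct rendered positions
    px_per_frame = screen_width_px / frames_shown    # gap between successive positions
    return frames_shown, px_per_frame

for fps in (60, 120, 240):
    frames, step = crossing_stats(fps)
    print(f"{fps:3d} fps -> {frames:5.0f} positions, ~{step:5.1f} px apart")

# At 120 fps the bird appears in only 30 positions, each ~34 px apart -- a series
# of discrete snapshots rather than the continuous streak the retina would record.
```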
 
Bolloxoid said:
Chalnoth said:
No, definitely not. The 1/8th of a second is not the time it takes the eye to move and refocus. It is the time (approximately: it varies depending upon brightness and color) that it takes one rod to "expose" and send its signal to the brain.
Not true. The latency of photoreceptor cells is in the order of a few hundred microseconds. Earliest cortical responses for visual stimuli are detected at around 40 milliseconds after presentation. And this is after the visual information has traveled from the retina to the thalamus and from there to the visual cortex.
Yes, I guess I can buy that, but this still doesn't change my argument on temporal aliasing.

This is simply due to the fact that computers render in discrete steps (note: even if our eyes were discrete there would still be a problem with temporal aliasing; in fact, it would be worse). This means that no framerate will be high enough. There will always be something that could move fast enough to not be resolved at a given framerate, and the framerates for fully resolving even relatively slow-moving objects are pretty ridiculously high (resolving to the resolution of the screen, i.e. no more than one pixel of motion per frame).

This means that independent of the actual characteristics of the human visual subsystem, you will get temporal aliasing. The only way to fix this is to actually break up the aliasing, by some method that destroys the hard edges that are created when a macroscopic object on the screen moves by more than a single pixel between frames.
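As a quick illustration of how high those "fully resolving" framerates get (the numbers here are illustrative, not from the post):

```python
# Sketch of the one-pixel-per-frame criterion: the framerate needed so an object
# never moves more than one pixel between consecutive frames.

def fps_for_one_pixel_motion(speed_px_per_s):
    """Frames per second at which per-frame displacement is at most one pixel."""
    return speed_px_per_s   # (px/s) / (1 px/frame) = frames/s

# An object crossing a 1600-pixel-wide screen in half a second moves 3200 px/s,
# so fully resolving that motion would already take a 3200 Hz display.
print(fps_for_one_pixel_motion(1600 / 0.5))   # 3200.0
```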
 
Chalnoth said:
So, I suppose to talk about the human visual system, you do have to talk about its two different parts: the rods and cones. The rods respond to brightness, and respond very quickly. These dominate our peripheral vision and our vision in the dark. Cones respond to color, much more slowly, and dominate the vision of objects we focus upon. They therefore are the dominant form of vision that is important when talking about games. Rods are more important for flicker, cones for the image that is displayed.
Actually, rod cells respond more slowly than cones and, with respect to the topic of this discussion, are irrelevant since their input is practically nonexistent in photopic lighting conditions.

It is true that peripheral vision has better temporal resolution. This is, however, due to the properties of ganglion cells and not photoreceptor cells.
 
Bolloxoid said:
Chalnoth said:
That's an understatement. It takes the eye about 1/8th of a second to get a full image.
That is a misunderstanding. Temporal summation in the visual system does indeed happen in a 1/8-second window, but that is a different thing.
I believe it also depends on the intensity of the light - the more light, the faster the visual system functions.
 
I believe it also depends on the intensity of the light - the more light, the faster the visual system functions.

I think that is debatable. I don't think that the system will necessarily go faster, but the increased stimulus will elicit an increased response. Nerve cells and ganglion cells can only propagate the impulse so fast, so no matter the amount of neurotransmitter a nerve receives at one end, it will only run that impulse along the axon at (I think it is about) 230 MPH. You could dump a bucket of neurotransmitter on the receptor ends of a nerve cell, and it will not affect the speed of the impulse running to the next nerve cell.
 
Lots of misinformation in this thread....

The refresh rate of your eye's sensors is between 50 and 70 Hz (depending on color). That's a simple biological fact. It's the reason that a monitor running at >75 Hz is perceived as stable and pleasant.

The actual reaction time from eye to brain is faster. A single pulse on a black screen can be registered in a time corresponding to 250 Hz.
BUT: a second pulse would NOT be registered if it came faster than 75 Hz, because of the refresh rate of the eye.

So, although you can think of extreme situations where very high framerates would be necessary, in practice the limit is simply around 75 Hz (fps).
 
Ylandro said:
So, although you can think of extreme situations where very high framerates would be necessary, in practice the limit is simply around 75 Hz (fps).
Jesus Christ, why do people spout this kind of nonsense WHEN YOU CAN EASILY TEST IT!!!??

Use Q3, set geometric detail to low, and set the resolution to something low; 640x480 should be sufficient to get roughly 400-500 fps out of any halfway decent system today. Next, set com_maxfps to 80 and the screen refresh to 160 Hz. Use any map, DM17 for the sake of history. Stay put and rotate rapidly on the spot. Next, set com_maxfps to 160. Rotate again.

The difference is bleeding obvious to a blind man in a dark room, no need to be a skilled player.
To experienced players the difference is not only obvious visually, but definitely affects their game play. If you're not a skilled player, you'll have to take my word for it. Other games have different demands.
Now, GO MAKE THE EXPERIMENT!
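To get a feel for why the rotation test is so telling, here is a small sketch with illustrative numbers of my own (not from the post): a quick 360-degree turn taking half a second.

```python
# Illustrative sketch: per-frame view rotation during a 360-degree turn done in 0.5 s.

def degrees_per_frame(fps, turn_deg=360.0, turn_time_s=0.5):
    return turn_deg / (fps * turn_time_s)

for fps in (80, 160):
    print(f"{fps} fps -> {degrees_per_frame(fps):.1f} degrees of rotation per frame")

# 80 fps gives 9-degree jumps between frames, 160 fps gives 4.5-degree jumps;
# halving the per-frame jump is exactly what the com_maxfps experiment exposes.
```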
 
Ylandro said:
So, although you can think of extreme situations where very high framerates would be necessary, in practice the limit is simply around 75 Hz (fps).
Please read the posts in this thread. The discussion has centered on the presentation of movement and the difficulties associated with that (temporal aliasing) and the remedies (either increasing the refresh and frame rate of the display device or applying temporal antialiasing techniques).

Refresh rates of 85-100 Hz do indeed eliminate flicker, nobody has contradicted that. Representing fast movement on-screen is another thing altogether.
 
60 fps is not enough for one (of the major) simple reason(s): many people run their monitors at 85 or 100 Hz, and with only 60 fps that would make vsync impossible without triple buffering.

Besides, I personally see a difference between 60 "vsynced" fps and 85 "vsynced" fps, not only in the flickering but in the smoothness of motion, and very much so in the mouse performance: it just feels more responsive the higher your fps goes, well beyond the point where you stop seeing any difference in actual motion smoothness.
So do most people I know who play computer games. Some others, though, can barely see the difference between 40 inconsistent fps and 85 vsynced fps...

Basing the arguments on theory makes no sense to me if there are actually people who perceive the difference. (Counting them all out as placebo effects doesn't make any sense either.)
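One concrete reason 60 fps on an 85 Hz monitor can never look even (a small sketch of my own, assuming plain double-buffered vsync): each rendered frame has to stay on screen for a whole number of refreshes, so the frames cannot all be displayed for the same length of time.

```python
# Sketch: map a 60 fps source onto 85 Hz refreshes with vsync (no tearing).
# Each refresh shows the newest frame that has finished by that point.
from collections import Counter

refresh_hz, source_fps = 85, 60
frame_on_each_refresh = [int(i * source_fps / refresh_hz) for i in range(refresh_hz)]

# How long does each source frame persist, measured in refreshes?
persistence = Counter(Counter(frame_on_each_refresh).values())
print(persistence)   # about 35 frames last one refresh, 25 last two -> visible judder
```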
 
Fixed 60 fps is very good, if it means the framerate never drops below 60, which is something you won't even achieve in Doom or Far Cry with a 6800U (at high detail).
A 60 fps minimum would usually correspond to an average fps score of around 120. It's the low frames that make a game (un)playable. Good enough.
 
DiGuru said:
digitalwanderer said:
60 fps is just peachy, 60Hz gives me headaches though.

Good point. I should have said fps, but I always try to avoid the confusion with FPS. I'll change it.


No, because 60 fps is a waste of time if your screen refresh rate is 43 Hz interlaced, just like 200 fps is useless if your refresh rate is 60 Hz.

In my opinion you can tell the difference above 60 fps, but only, say, up to about 100 fps, because lower framerates are only noticeable in certain situations, like when an object flies across your screen so fast it only actually appears in two frames. Consider such an object (say a rocket in Doom 3). If you are at 50 fps then over a 40 ms period you can see this object, but because of how it is rendered you see it in two places: once in frame one, on the left, and again in frame two when it has made its way to the right.

There is no 'in between', hence you can tell. In real life the object would leave a streak of 'detection' across your retina. You would see it blurred and faint, because it spends hardly any time imprinting its image on an individual part of the retina, but you would see it along its entire path, as opposed to two snapshots of it in perfect clarity.

If you show that object at 100 fps then you would get 4 images of it on your retina, at 200 fps then 8. At 500,000 fps you would see it 20,000 times, which would produce the 'smear along its path' on your retina as accurately as your screen resolution allows **note**.
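(A quick check of those figures, since they are all the same arithmetic: the number of distinct images is just fps times the 40 ms the object spends on screen.)

```python
# Quick check of the figures above: images drawn during a 40 ms crossing = fps * 0.04 s.
for fps in (50, 100, 200, 500_000):
    print(f"{fps:>7} fps -> {fps * 0.04:.0f} distinct images")
# 50 -> 2, 100 -> 4, 200 -> 8, 500000 -> 20000
```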

So basically, you need as much fps as possible, but other limitations come into play. If an object imprints on your retina for too short a time then you aren't going to notice it, because there is not enough time to build up the image to a level that the brain can recognise and interpret as 'not the wall behind what's going on'.

It's this time that will vary from person to person, but it's a case of the faster it moves, the 'fainter' it appears. I personally settle for 100 fps at vsync; at this fps, playing Counter-Strike, it is rare that I can see an object moving fast enough to skip a significant number of pixels each frame.


**note**

The bigger your screen and the lower your screen resolution, the more you can do about a lower framerate, because a pixel's difference in position represents a larger distance across your retina at lower resolutions. I don't think that's going to make a huge difference though, but it does give you a minimum speed for the smoothest you can possibly render any arbitrary scene.

Dave-
 
Dave B(TotalVR) said:
In my opinion you can tell the difference above 60 fps, but only, say, up to about 100 fps, because lower framerates are only noticeable in certain situations, like when an object flies across your screen so fast it only actually appears in two frames. Consider such an object (say a rocket in Doom 3). If you are at 50 fps then over a 40 ms period you can see this object, but because of how it is rendered you see it in two places: once in frame one, on the left, and again in frame two when it has made its way to the right.


200 fps is very noticeably smoother to me than 100 fps in a fast game like Q3 (that's at 200 Hz). Conflict: FreeSpace and FreeSpace 2 (those fairly old space combat sims from Volition) also give a far more convincing sense of motion on a screen running at 160/200 Hz than at 100 Hz (they're old games, so they have limited res, or I'd just run at high res and 100 Hz).

So, although you can think of extreme situations where very high framerates would be necessary, in practice the limit is simply around 75 Hz (fps).

No, because the cells in your eyes are not all synced; they respond when they get hit, not on some clock generator in your eye. This raises the limit substantially.

75 Hz is far from flicker-free, particularly on newer monitors, as they have shorter phosphor glow times. 85 Hz is an absolute minimum imo, and even then I actually find 85 Hz a little flickery after almost exclusively using 100 Hz for several months.

Overall it does depend on the game; it would be best to just keep things how they are imo and let the user decide what IQ/speed balance they want.
 
Dave B(TotalVR) said:
No, because 60 fps is a waste of time if your screen refresh rate is 43 Hz interlaced, just like 200 fps is useless if your refresh rate is 60 Hz.
It is not useless if the game engine runs at 200 updates/s. In that case, you can have a more responsive and smoother game experience even if your display can't show that many discrete frames per second. Specifically, Quake engine games let you perform better at higher fps numbers because more intermediate physics steps are calculated.

Temporal aliasing does not refer to the visual domain alone. There is temporal aliasing in game physics/AI/controls too.
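The usual way to get those extra simulation steps without depending on the display (a generic sketch, not the actual Quake code) is to run a fixed simulation timestep decoupled from the presentation rate:

```python
# Hypothetical sketch, not Quake's actual loop: a fixed 200 Hz simulation step
# decoupled from a 60 Hz display. Input and physics get sampled far more often
# than frames are shown, which is where the extra responsiveness comes from.

SIM_DT = 1 / 200     # fixed physics/input timestep
FRAME_DT = 1 / 60    # how often a frame is actually presented

def game_loop(display_frames=60):
    accumulator, sim_steps = 0.0, 0
    for _ in range(display_frames):
        accumulator += FRAME_DT        # one displayed frame's worth of real time
        while accumulator >= SIM_DT:   # run every fixed simulation step that fits
            # update_physics(SIM_DT); sample_input()   # placeholders
            accumulator -= SIM_DT
            sim_steps += 1
        # render_latest_state()        # placeholder: draw once per displayed frame
    return sim_steps, display_frames

print(game_loop())   # about (200, 60): several simulation steps per displayed frame
```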
 
Dave B(TotalVR) said:
In my opinion you can tell the difference above 60fps, but only, say, up to about 100FPS because lower framerates are only noticeable in certain situations, like when an object flies across your screen so fast it only actually appears in two frames.
No matter your framerate, you'll still have temporal aliasing. One way to think of this is that data will be missing between frames no matter what, due to the simple fact that frames are rendered in small, discrete steps. This aliasing becomes extremely noticeable in a number of different scenarios. One that I already pointed out is strafing around a corner. If you do this, you will see the corner's image "ghosted" multiple times as you move. To me, this is rather distracting (note: you need pretty high framerates to see this effect).
 
Bolloxoid said:
Dave B(TotalVR) said:
No, because 60 fps is a waste of time if your screen refresh rate is 43 Hz interlaced, just like 200 fps is useless if your refresh rate is 60 Hz.
It is not useless if the game engine runs at 200 updates/s. In that case, you can have a more responsive and smoother game experience even if your display can't show that many discrete frames per second. Specifically, Quake engine games let you perform better at higher fps numbers because more intermediate physics steps are calculated.

Temporal aliasing does not refer to the visual domain alone. There is temporal aliasing in game physics/AI/controls too.

How would you notice, if all the feedback you get is 60 frames per second? You'll never see the other 140 frames, so how is that better?

You may notice very high framerates at lower refresh rates because the corresponding minimum framerates will usually be high as well, but I very much doubt minimum fps above the refresh rate can be experienced. That only happens with really old games like Quake 3 on a fast rig.
 
Sandwich said:
How would you notice, if all the feedback you get is 60 frames per second? You'll never see the other 140 frames, so how is that better?

Because basically the game is integrating with a timestep. The smaller the timestep, the more accurate the simulation is. Therefore, in games where the timestep is linked to framerate, the higher the fps, the more accurate the simulation.

Now, even if you only see 60 fps while the internal game update runs at 200 Hz, the places where that will come into play are things that are moving while rotating. Movement will look much more fluid the higher the internal sampling rate is, even if you only see part of those updates.
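A minimal sketch of that timestep effect (illustrative values, nothing taken from an actual engine): integrating a simple jump with different step sizes gives measurably different trajectories, which is also the root of the framerate-dependent Quake 3 jump distances mentioned further down the thread.

```python
# Minimal sketch: the same jump integrated with different fixed timesteps.
# Illustrative numbers; not taken from any actual game.

def jump_apex(dt, v0=270.0, g=800.0):
    """Integrate a vertical jump with simple Euler steps and return the peak height."""
    y, v, apex = 0.0, v0, 0.0
    while y >= 0.0:
        v -= g * dt          # gravity updates the velocity
        y += v * dt          # then the position advances by one step
        apex = max(apex, y)
    return apex

for fps in (30, 60, 125, 333):
    print(f"{fps:3d} fps -> jump apex {jump_apex(1.0 / fps):.2f} units")

# The apex differs by several units between 30 fps and 333 fps and only converges
# toward the analytic v0**2 / (2*g) = 45.56 as the timestep shrinks.
```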
 
Cryect said:
Because basically the game is integrating with a timestep. The smaller the timestep, the more accurate the simulation is. Therefore, in games where the timestep is linked to framerate, the higher the fps, the more accurate the simulation.

Now, even if you only see 60 fps while the internal game update runs at 200 Hz, the places where that will come into play are things that are moving while rotating. Movement will look much more fluid the higher the internal sampling rate is, even if you only see part of those updates.
There is no integration going on here that would affect that one bit.
 
The distance you can jump in Q3 varies depending on framerate due to the way it's worked out. In newer patches there's an option that fixes this, afaik. The OSP mod also has a fix for it.

Most games aren't affected by bugs like this, however.
 