Display Rate and animation

pixelshader

Newcomer
Hello..

I'm trying to make an application something like this:
1) A window of width, say, 400 pixels.
2) Every one-second interval, a vertical line moves from the left of the window to the right.

If I render at 400 FPS the motion of the line is smooth, but at a lower FPS like 100 the motion is jerky.

So my doubt is whether rendering at a high FPS (400+) is the only way to fool your eye into believing it's smooth?
I was under the wrong assumption that beyond a particular FPS the eye wouldn't be able to recognize any difference in FPS :?:
I'm using OpenGL. I would like to get input from you guys...
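For concreteness, here is a small sketch (my own, not from the post) of the arithmetic behind the jerkiness: the per-frame step is simply speed divided by frame rate, so only at 400 fps does the line advance a single pixel per frame.

```python
WIDTH = 400        # window width in pixels
SPEED = WIDTH      # pixels per second (one full crossing per second)

def step_per_frame(fps):
    """Horizontal distance (in pixels) the line jumps between consecutive frames."""
    return SPEED / fps

print(step_per_frame(400))  # 1.0 px per frame -> looks smooth
print(step_per_frame(100))  # 4.0 px per frame -> visible 4 px jumps
```

At 100 fps the line lands on every fourth pixel column, which is exactly the "skips 3 pixels" effect discussed further down the thread.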
 
Silly question, but what are you using to display this 400Hz animation?
 
a 400hz monitor of course, silly
Given the dominance of LCDs, which typically refresh at 60Hz, I thought I should ask. (Besides, surely it'd have to be something like VGA (or QVGA :p) resolution on a CRT to get a 400Hz refresh!)
 
Maybe he's not really interested in the visual refresh, but in the logic update Hz, with the visual artefact he described simply piquing his curiosity? I know it's a stretch but... :eek:
 
So my doubt is whether rendering at a high FPS (400+) is the only way to fool your eye into believing it's smooth?
I was under the wrong assumption that beyond a particular FPS the eye wouldn't be able to recognize any difference in FPS :?:
I'm using OpenGL. I would like to get input from you guys...


Depends on how fast the animation is moving, the contrast of the colours involved, and even whether you're actually interacting with something.

There are a lot of myths that 24/25/30/40/60 fps is all the human eye can see, but the human eye is an analogue device, so theoretically the cap is infinity. In practice, however, 30 fps suffices for passive material and 60 fps is enough for the vast majority of people playing a game. A few of the more fussy types may want up to 100 fps in a fast FPS game. Beyond that, extra smoothness can be observed, but it's very much diminishing returns. People also tend to notice something more when it's taken away. E.g. someone might not notice a change from 60 to 100 fps, but if they then played for some time at 100 fps and got used to it, they'd be far more likely to notice going back to 60.

And of course, as pointed out, you need a screen to support it; framerates in excess of the screen's refresh rate give no benefit. That's mostly why I'm still stuck on a CRT, actually: no 100/120 Hz LCDs around yet.
 
So my doubt is whether rendering at a high FPS (400+) is the only way to fool your eye into believing it's smooth?
I was under the wrong assumption that beyond a particular FPS the eye wouldn't be able to recognize any difference in FPS :?:
Currently your algorithm at 100 Hz skips 3 pixels every frame, so the movement cannot look smooth.
The information about the line's movement between those frames simply isn't there.

It's the same as if the line moved across the whole image during each frame interval: the line would then appear in the same place every frame.
You need proper motion blur to convey what happens during the whole frame interval, not just at a single moment in time.
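A minimal sketch of what this suggestion could look like (names and numbers are my own, matching the thread's 400 px/s example): instead of drawing the line at a single instant, accumulate its coverage over several sub-samples inside the frame interval, spreading its energy across the pixels it crossed.

```python
FPS = 100          # frames per second
SPEED = 400.0      # line speed in pixels per second
WIDTH = 400        # window width in pixels
SUBSAMPLES = 16    # temporal samples taken within each frame interval

def blurred_row(frame_index):
    """Return per-pixel intensity (0..1) for one motion-blurred frame of the line."""
    row = [0.0] * WIDTH
    dt = 1.0 / (FPS * SUBSAMPLES)   # time between sub-samples
    t0 = frame_index / FPS          # time at the start of this frame
    for s in range(SUBSAMPLES):
        x = int(SPEED * (t0 + s * dt)) % WIDTH  # line position at this sub-instant
        row[x] += 1.0 / SUBSAMPLES              # accumulate coverage
    return row

row = blurred_row(0)
# The line's energy is now spread evenly across the 4 pixel columns it
# crossed during the frame, rather than concentrated in a single column.
```

This is just a box filter in time; a renderer would typically do the equivalent with an accumulation buffer or a shader, but the idea is the same: each frame encodes the whole interval of motion it represents.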
 
<snip>That's mostly why I'm still stuck on a CRT, actually: no 100/120 Hz LCDs around yet.

True, although an LCD's refresh rate is per pixel, not per frame, so you can have different parts of the screen showing parts of different frames without the tearing effect of CRTs without VSYNC. So, technically, while a person isn't seeing >60 fps on an LCD, they can see >60 unique fps.
 
AKA (temporal) aliasing

I don't think temporal aliasing is the same; I mean, e.g., a propeller at certain speeds can appear to be moving very slowly, stationary, or even moving backwards.

edit: just checked out temporal aliasing on wiki and you're right - oops
 
True, although an LCD's refresh rate is per pixel, not per frame, so you can have different parts of the screen showing parts of different frames without the tearing effect of CRTs without VSYNC. So, technically, while a person isn't seeing >60 fps on an LCD, they can see >60 unique fps.

I was under the impression data is fed out over the DVI connection just as it is over standard VGA, i.e. left to right, top to bottom (obviously the porches etc. are not added to the digital signal), and that LCDs simply change each pixel as they receive it, just as CRTs do during their scanning process.

The Dell FP2001/8s I use at work show the same tearing as a CRT.
 
I was under the impression data is fed out over the DVI connection just as it is over standard VGA, i.e. left to right, top to bottom (obviously the porches etc. are not added to the digital signal), and that LCDs simply change each pixel as they receive it, just as CRTs do during their scanning process.

The Dell FP2001/8s I use at work show the same tearing as a CRT.

But when you spoke of sticking with the CRT, were you talking about that annoying (and headache-inducing) 60Hz feeling? For instance, I consider 75Hz the bare minimum for working a few hours, with 85Hz a much more comfortable experience (what my CRT is set to). On the other hand, LCDs, despite their "60Hz", do not cause the same problem.
 
But when you spoke of sticking with the CRT, were you talking about that annoying (and headache-inducing) 60Hz feeling? For instance, I consider 75Hz the bare minimum for working a few hours, with 85Hz a much more comfortable experience (what my CRT is set to). On the other hand, LCDs, despite their "60Hz", do not cause the same problem.

I've been using LCDs for so long I had forgotten the 60Hz CRT effect - that is, until I took the MCAT (a 4+ hour test that requires a lot of reading) on a CRT that was obviously set to 60Hz. Ugh.
 
But when you spoke of sticking with the CRT, were you talking about that annoying (and headache-inducing) 60Hz feeling? For instance, I consider 75Hz the bare minimum for working a few hours, with 85Hz a much more comfortable experience (what my CRT is set to). On the other hand, LCDs, despite their "60Hz", do not cause the same problem.


Nah, I was referring to the effective framerate limit (where I mentioned a few of the more fussy types - I am one of those).

I'm quite appreciative of the fact that LCDs don't flicker due to refresh rate, though some do flicker due to the way the backlight operates - usually at a very fast rate, of course.
 
So my doubt is whether rendering at a high FPS (400+) is the only way to fool your eye into believing it's smooth?
The frame rate at which the sampled movement of a stimulus looks smooth (that is, indistinguishable from continuous, non-sampled motion) depends entirely on stimulus parameters (i.e., its contrast, speed and spectrum). That really is an empirical question; you would simply need to measure it.
I was under the wrong assumption that beyond a particular FPS the eye wouldn't be able to recognize any difference in FPS :?:
That assumption is indeed wrong. There is no magic number. There are of course practical answers regarding the desired minimum frame rate for typical games, but the question is slightly irrelevant, since display technology (LCD and CRT) is the limiting factor that sets the cap.
 