Physical limitations of the human vision system and the impact on display tech

Yes, and your eyes would create it naturally if looking at video played back at sufficiently high framerate.
Perhaps it's because I've only seen games running at 60/90Hz; I don't know what the minimum is. 240Hz? Perhaps that is the minimum I would need to see some details.
But remember it's not just the monitor that needs to hit 240Hz; the GPU/CPU need to as well, or else you're just displaying some frames 2x or 3x. I just had a google, and the most powerful CPU/GPU combinations in SLI don't even manage 240fps with current games (*).
So until games actually reach 240fps (or whatever the number is), motion blur is needed to approximate what we see with our eyes.

(*) Though perhaps some hardware does hit 240fps in current games; from a quick look on AnandTech, two GeForce 980 Tis in SLI hover around 80-120fps.
 
I don't know what the minimum is. 240Hz?
There is no threshold. It depends on the scene content. For motion blur to resolve naturally, the frame-to-frame delta of objects in the user's visual field needs to be tight enough that, when subsequent frames are blended, the result looks like a smooth trail instead of a series of ghost images. If the camera is static with a wide field of view and the only moving object is a truck twenty feet away moving at the speed of a slug, 5fps would probably be much more than enough. For helicopter blades, it's obviously inadequate. For a mouse cursor moving very fast across your desktop? Well, that one's easy to test.

If you jerk the mouse back and forth at 60fps, you might feel like you're able to simultaneously see a pack of cursor images.
At 120fps, you'd see double the cursor images at half spacing.
At 240fps, quadruple the cursor images at quarter spacing.
Crank the fps high enough, and the cursor ghost images in your vision would be so close together that they'd blend into a smooth motion-blur trail (obviously you can't actually do this, but it's easy to see how it works).
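To put rough numbers on it, here's a quick Python sketch of the cursor example; the 4000 px/s flick speed is just a made-up figure for illustration:

```python
# Rough sketch: spacing of the cursor "ghosts" your eye accumulates while a
# cursor sweeps across the screen at a fixed speed. The speed is an assumed
# example value, not a measurement.

CURSOR_SPEED_PX_PER_S = 4000  # assumed: a fast flick across a desktop

for fps in (60, 120, 240, 960):
    gap_px = CURSOR_SPEED_PX_PER_S / fps  # distance the cursor jumps between frames
    print(f"{fps:4d} fps -> ghost images spaced ~{gap_px:.1f} px apart")
```

Doubling the framerate halves the gap, which is exactly the "double the images at half the spacing" pattern described above.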
 
Of course there is; our eyes/brains do not resolve at infinite FPS.
The question wasn't "what's the temporal resolution of the eye."
If that were the issue, we'd just blast our eyes with alternating bright/dark imagery at high frequencies and figure out where it smooths out. A 60Hz CRT displaying a white image has a flicker pattern sort of like a 120fps video of alternating black/white; at 60Hz the flickering is faintly visible. Crank the CRT up to 100Hz and it probably won't be visible at all. So, as a vague ballpark guess, 200fps is probably good enough for the human visual system (although depending on what exactly we're talking about, we might still have sample-and-hold blur and whatnot to deal with).

The question was at what framerate non-motion blurred video will appear naturally motion blurred to a human viewer.
This isn't a question of the temporal resolution of our eyes so much as it is a function of how fast something is moving across the visual field.
Double the speed of the mouse cursor, and you double the gaps between visible cursor ghosts, halving the smoothness of the accumulated blend result that occurs in the eye.
Pick any stupidly high hypothetical FPS, and I can come up with some stupidly high-speed object that would lack correct blurring to the viewer's eyes at that FPS.

Now, if we're allowing the video to be motion blurred, then we're (roughly speaking) back to the first question. A 1000000000000fps unblurred video of a mouse cursor isn't going to be particularly distinguishable from a 500fps video of the same mouse cursor but with "perfectly accurate" motion blur included. But, if neither video had motion blur, you could still have quite visible ghost cursors at 500fps; that this is way above what we can temporally distinguish is irrelevant.
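As a rough sketch of that equivalence (1-D positions, made-up speed and framerates): averaging groups of high-rate samples into one frame is basically how "perfectly accurate" motion blur gets generated, so each blurred low-fps frame covers the same trail the unblurred high-fps frames would have painted:

```python
# Sketch: the blur trail each low-fps frame would cover, reconstructed from a
# much higher-rate unblurred sampling of the object's position. Speeds and
# framerates are assumed example values.

HIGH_FPS = 100_000
LOW_FPS = 500
GROUP = HIGH_FPS // LOW_FPS     # high-rate samples blended into one blurred frame
SPEED = 2000.0                  # assumed: position units per second

# One second of unblurred high-rate positions:
positions = [SPEED * i / HIGH_FPS for i in range(HIGH_FPS)]

for frame in range(3):
    chunk = positions[frame * GROUP:(frame + 1) * GROUP]
    print(f"low-fps frame {frame}: blur trail spans {chunk[0]:.2f} to {chunk[-1]:.2f}")
```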

//===========================

This whole issue is analogous to the question of "what resolution do you need at X screen size and Y viewing distance." If we're talking about resolution/clarity, we can give an answer. But when people start talking about rendering resolution, particularly in cases where you can experience spatial aliasing, it breaks the question; you could stand 60 feet from a 30-inch screen, and it'll still be quite obvious when a super-bright subpixel-width element is flickering as the scene moves. It's impossible to pick a resolution high enough that any hypothetical detail won't visibly alias.
 
The question was at what framerate non-motion blurred video will appear naturally motion blurred to a human viewer.
Correct.
Pick any stupidly high hypothetical FPS, and I can come up with some stupidly high-speed object that would lack correct blurring to the viewer's eyes at that FPS.
Yes, you are correct that speed plays a role, though of course it's not just speed but actually perceived speed = speed / distance. Thus a train going 1000km/hr at 10 meters away will travel further across the FOV (and blur more) than one 10km away.
BTW, I don't know if you're aware (this is from memory, so details may be wrong), but the eye actually only sees about 20% of the time, and our brain fills in the gaps. If movement happens in that 20%, does our brain see it? ;)

Pick any stupidly high hypothetical FPS, and I can come up with some stupidly high-speed object that would lack correct blurring to the viewer's eyes at that FPS.
OK I'll pick the helicopter rotors, go for it
 
OK I'll pick the helicopter rotors, go for it
He said pick an FPS. So 5000 fps, say. It's easy enough to come up with a case where you'd get ghosting rather than smooth motion, given a large enough screen and high enough delta. Although of course high enough deltas become too fast for the human eye to even see. So yeah, a bullet may strobe on your 5000 fps screen, but it's not like anyone would see it!
 
Yes, you are correct that speed plays a role, though of course it's not just speed but actually perceived speed = speed / distance. Thus a train going 1000km/hr at 10 meters away will travel further across the FOV (and blur more) than one 10km away.
Yeah, that's why I was usually using wording like "speed across the visual field." I abbreviated it that time, thinking it was implied that I'd be able to choose things like the object's location.

BTW, I don't know if you're aware (this is from memory, so details may be wrong), but the eye actually only sees about 20% of the time, and our brain fills in the gaps. If movement happens in that 20%, does our brain see it? ;)
I'm fairly certain that you're mixing something up. Our eyes are basically constantly exposing; it's possible to reliably see individual bright flashes, even when they only occur for tiny fractions of a second. A good example is camera flash; some flash setups operate with 1/1000s shutter or even much faster, and you can reliably see it, as long as the brightness*time of the flash is sufficient.

There are phenomena where our visual perception shuts down, such as saccades, but not generally during continuous idle or eye-tracking visual periods. And that's a matter of the brain fudging our perception, not eye exposure.

OK I'll pick the helicopter rotors, go for it
I said pick a framerate.

Helicopter rotors are a funky case due to their cyclic motion though; if you selected a specific framerate, I could pick a rotor rate that would visibly alias to a small number of blades, but if you selected a range, there'd be ratios where you'd get enough overlapping images to eliminate it as an issue.

That's not an issue for linear motion, though. For instance...
He said pick an FPS. So 5000 fps, say. It's easy enough to come up with a case where you'd get ghosting rather than smooth motion, given a large enough screen and high enough delta. Although of course high enough deltas become too fast for the human eye to even see. So yeah, a bullet may strobe on your 5000 fps screen, but it's not like anyone would see it!
...you'd just have to pick a very bright bullet.

Pick one billion FPS, and I'll pick a bullet that's going so fast across your visual field that it shows up in only four frames, and I'll also specify that it's 100 million times brighter than other things in the frame, so it'll still trigger your retina, and you'd get four images of a bright bullet in your vision. Adjust the framerate by a bit and the issue wouldn't go away.

Is that a reasonable case? Not really, but it illustrates the lack of a threshold. Is there a region where people would stop caring? Probably, but that's not the same as having a threshold where the issue can be said to disappear.
 
I said pick a framerate.
OK, sorry, I misread. OK, a trillion FPS :) Now that's less than infinite; you're claiming we can tell the difference between 1 trillion and 1 trillion + 1 FPS!

Google it: there are lots of people saying there is a limit to the Hz rate humans can see, yet I could not find a single person saying there is no limit.
Link me one single paper saying a person can see a flash of incredibly short duration.

Every other human sense has limits, yet you're saying human sight, WRT Hz rate, is the only one that doesn't? I'm not buying it.
 
OK, sorry, I misread. OK, a trillion FPS :)
I already basically answered this.

I say I have a hypothetical object travelling fast enough that it crosses the visual field in 3 or 4 frames or whatever.
And I choose it to be bright enough that it trips your rods and cones to create a visible light accumulation at every frame (say, a trillion times brighter than a typical object in the scene).

Now that's less than infinite; you're claiming we can tell the difference between 1 trillion and 1 trillion + 1 FPS!
I never claimed that we can distinguish between proportionally tiny differences in framerate.

Although, if I'm allowed to use helicopter blade aliasing, I actually could choose very specific rotor frequencies that would alias differently between those two specific framerates. For example, a helicopter rotor spinning at exactly 1 trillion Hz would look perfectly fixed in space at 1 trillion fps, but at 1 trillion+1 fps it would have a 1Hz rotation because of the slight offsets in position at the different frame samples.
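For the curious, a minimal sketch of that aliasing arithmetic, treating a single marked point on the rotor and using the hypothetical frequencies above:

```python
# Toy model of rotor aliasing: fold the per-frame rotation of a marked point on
# the rotor into the nearest apparent motion, then convert back to an on-screen
# spin rate. Negative means it appears to rotate backwards.

def apparent_spin_hz(rotor_hz: float, fps: float) -> float:
    advance_rev = rotor_hz / fps               # revolutions between consecutive frames
    folded = (advance_rev + 0.5) % 1.0 - 0.5   # apparent per-frame advance in [-0.5, 0.5)
    return folded * fps

print(apparent_spin_hz(1e12, 1e12))       # 0.0 -> rotor looks frozen at 1 trillion fps
print(apparent_spin_hz(1e12, 1e12 + 1))   # ~ -1.0 -> roughly 1 Hz apparent rotation, backwards
```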

Google it: there are lots of people saying there is a limit to the Hz rate humans can see, yet I could not find a single person saying there is no limit.
Those people are addressing temporal resolution. I'm talking about light accumulating at spatially-separated regions over an accumulation period.

I'm not claiming that you can directly distinguish the high framerates in a temporal sense. For example, it would be basically impossible to distinguish between the 3 ghost images that occur in your visual system when viewing the super-bright bullet in the 1 trillion fps video, and 3 (less bright since they're displayed for longer) ghost images baked into a single frame of a 500fps video.

Link me one single paper saying a person can see a flash of incredibly short duration.
Not sure about a paper addressing that exact question, but that's because researchers sort of take it for granted. When they ask about registering minimum visibility, they phrase it in ways like "what's the least number of photons that will register a visual response?" Whether those photons enter the eye over a femtosecond or a nanosecond or a microsecond isn't really relevant, as long as they all enter closely enough that the sluggish responses given by our visual receptors are all in progress simultaneously. That's not the same thing as saying we can distinguish femtoseconds; if something high-intensity happened at femtosecond 1, and something else happened at femtosecond 5, we could see both, but we wouldn't be able to distinguish which order they happened in.

Photographers, in a more casual context, certainly take it for granted that they couldn't make their flashes so fast as to be invisible, as long as they're bright enough.

The eyes accumulate light energy over a duration. Asking whether you can make a flash so fast that it's invisible is like asking whether there's a duration you can physically push an object for that's so brief it can't move said object. If you halve the duration of the push but double its power, the object still gets pushed. If a ton of photons enter your eye over a femtosecond, it'll cause your rods and cones to flare up, much as it would if the same photons entered over a millisecond.
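In toy-model form: suppose seeing a flash only requires enough photons to land inside the eye's integration window. The threshold and flux numbers here are assumptions picked for illustration, not measured values:

```python
# Toy model: a flash "registers" if enough photons arrive, regardless of how
# briefly they arrive. Threshold and flux values are assumptions.

PHOTON_THRESHOLD = 100.0  # assumed rough order of photons needed to register

def flash_registers(photons_per_second: float, duration_s: float) -> bool:
    total_photons = photons_per_second * duration_s  # what matters is flux * time
    return total_photons >= PHOTON_THRESHOLD

print(flash_registers(1e8, 1e-5))    # True:  modest flux, short flash
print(flash_registers(2e17, 1e-15))  # True:  femtosecond flash, but enormously bright
print(flash_registers(1e8, 1e-15))   # False: same femtosecond flash, far too dim
```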
 
To illustrate (quite roughly) what I'm claiming is going on:

If this is what you're looking at (slowed down by a factor of a couple hundred):

[animated GIF: OeWP9mD.gif]


Then your visual system responds in such a way that something like this is what ends up at the signal output of the eyeball:

[animated GIF: SEyB7zC.gif]


Now imagine that this is what you're looking at:

[animated GIF: S64TACt.gif]


Now, here's the signal that your eyeball produces:

[animated GIF: AQJyDIf.gif]


Now you've got two temporally-smudged events that are near-simultaneous. Speed it up by a factor of a hundred or two or whatever, and can you decipher which event happened first? Not particularly. But can you still see both events, which occurred in spatially close proximity? Sure. So you've got these events, each only happening for about one one-thousandth of a second and separated by about that same duration, and you can't actually distinguish their durations or which order they occurred in (we have nowhere near enough temporal resolution), but the fact that they occurred still trips the sensors in the eyeball into registering that events happened at certain spots in the visual field.
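Here's a rough numerical version of the same idea, with the sluggish receptor response modelled as an assumed 20 ms exponential decay (the real response shape is more complicated; this is only meant to show the smearing):

```python
import math

# Two 1 ms flashes at different spots, roughly 1 ms apart, each convolved with
# a sluggish receptor response modelled as an assumed 20 ms exponential decay.

DT_S = 0.001   # 1 ms simulation step
TAU_S = 0.020  # assumed receptor time constant
N = 100        # simulate 100 ms

def smeared_response(flash_start_ms: int, flash_len_ms: int) -> list:
    stimulus = [1.0 if flash_start_ms <= t < flash_start_ms + flash_len_ms else 0.0
                for t in range(N)]
    kernel = [math.exp(-(k * DT_S) / TAU_S) for k in range(N)]
    # discrete convolution of the flash with the receptor impulse response
    return [sum(stimulus[t - k] * kernel[k] for k in range(t + 1)) for t in range(N)]

spot_a = smeared_response(flash_start_ms=10, flash_len_ms=1)  # event A
spot_b = smeared_response(flash_start_ms=12, flash_len_ms=1)  # event B, ~1 ms later

half_a = [t for t, v in enumerate(spot_a) if v > 0.5 * max(spot_a)]
half_b = [t for t, v in enumerate(spot_b) if v > 0.5 * max(spot_b)]
print(f"spot A above half-max from {half_a[0]} ms to {half_a[-1]} ms")
print(f"spot B above half-max from {half_b[0]} ms to {half_b[-1]} ms")
# Both smeared responses stay above half their peak for ~14 ms and overlap
# almost entirely, even though the flashes were 1 ms each and non-overlapping:
# you register both events but can't tell which came first.
```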
 
I already basically answered this.
Yes mate, I realize that (and I do fully understand what you're saying), but I just don't accept your answer. All parts of the body (including the eyes) have limits; they don't operate over an infinite range. (Whilst eyes do show 'afterglow' on bright lights etc., you will not see this if it's too fast to see in the first place.)
E.g. I doubt we can see a bright light if it was shone for 0.000000000000001 seconds.
Unless we have some expensive tech, I doubt we could test this.

I will think of another way.
edit: Perhaps with sound: would it be possible to hear something a billionth of a second long? I doubt it, no matter the volume.
edit2: Thinking about it more, you wouldn't be able to hear it, would you, as a wave of that Hz is outside the human range of hearing. IIRC light also travels as waves, hmmm, though perhaps I'm mixing this up.

BTW, WRT sound vs. vision, IIRC the nerves/brain pick up sound 10x faster than the eyes pick up light.
 
edit: Perhaps with sound: would it be possible to hear something a billionth of a second long? I doubt it, no matter the volume.
edit2: Thinking about it more, you wouldn't be able to hear it, would you, as a wave of that Hz is outside the human range of hearing. IIRC light also travels as waves, hmmm, though perhaps I'm mixing this up.
We can't hear sounds higher than 20kHz (or lower than 20Hz) because the wave itself has a frequency that our ears cannot detect.

The equivalent phenomenon with visual receptors is the energy of a photon. A photon with a given energy has a given wavelength, and we can see photons whose wavelength is within the visible range. If a bunch of photons with an energy of 2.5eV (roughly 600THz) are hitting your eyes, you see greenish light. If you double the number of photons that are hitting your eyes, you see brighter greenish light.

What you're mixing up is wave frequency versus event distribution frequencies. In terms of us being able to hear or see something, the wavelength of a sound wave is analogous to the wavelength of a photon. It is not analogous to the frequency at which bursts of photons are entering the eye. Actually, individual visual receptors in the eye can pick up individual photons (although a number need to enter in close temporal proximity for it to actually register a big enough response across your systems to be consciously "seen").
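As a quick sanity check on those numbers (not from anything above, just E = h*f and c = f*lambda):

```python
# Convert a photon energy in eV to frequency and wavelength.

PLANCK_EV_S = 4.135667696e-15   # Planck constant, eV*s
LIGHT_SPEED_M_S = 2.99792458e8  # speed of light, m/s

def photon_freq_thz(energy_ev: float) -> float:
    return energy_ev / PLANCK_EV_S / 1e12

def photon_wavelength_nm(energy_ev: float) -> float:
    return LIGHT_SPEED_M_S / (energy_ev / PLANCK_EV_S) * 1e9

print(f"2.5 eV -> {photon_freq_thz(2.5):.0f} THz, {photon_wavelength_nm(2.5):.0f} nm")
# ~604 THz and ~496 nm: the greenish light mentioned above.
```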

//================

If white light reaches your eyes in 50% duty cycle bursts at 1MHz, you see white light at half the peak brightness.

The time-domain way of describing this is that the low temporal precision of the eyes smudges away the pulsing, but the rods and cones still acknowledge the photons, because said photons still land within the passband of said rods and cones.

The frequency-domain way to put this is that your visual system DOES apply a low-pass filter to the photon event distribution waveform ("low temporal precision of the eyes"), but because there's no "negative" component of said waveform, there's a nonzero 0Hz component which remains in the output of said low-pass filter.
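A minimal sketch of that frequency-domain view, with the eye's cutoff stood in for by an arbitrary first-order low-pass (the exact cutoff doesn't matter for the argument):

```python
import math

PULSE_HZ = 1_000_000  # 1 MHz flicker, 50% duty cycle, peak brightness normalised to 1.0
DUTY = 0.5
EYE_CUTOFF_HZ = 60.0  # assumed stand-in for the eye's temporal cutoff

# The 0 Hz (average) component of the pulse train passes straight through:
print(f"perceived brightness ~ {DUTY:.2f} of peak")

# The 1 MHz fundamental (and its harmonics) sit far above the cutoff, so a
# first-order low-pass attenuates them to essentially nothing:
attenuation = 1.0 / math.sqrt(1.0 + (PULSE_HZ / EYE_CUTOFF_HZ) ** 2)
print(f"flicker component attenuated to ~{attenuation:.1e} of its amplitude")
```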
 
E.g. I doubt we can see a bright light if it was shone for 0.000000000000001 seconds.
Unless we have some expensive tech, I doubt we could test this.
Can you guys take this discussion to PM, please? It's gotten way too hypothetical. We're dealing with 30/60/120/144 Hz displays, and thinking about imaginary 500+ Hz displays.
 
It's all diminishing returns, but some real-life data:

At 120fps you can still see blurring on some content fairly easily. I tested this on quite a lot of TVs a few years back by running the same scene side by side on a 120fps-input monitor and a 60fps-input TV with frame rate conversion. On a scene which the TV's frame rate conversion can do a good job on (e.g. a steady panning photo), the TV interpolation up to 240fps was sharper. So your eye can perceive 240fps as less blurred than 120fps.

Furthermore, there were a few TVs that claim silly high interpolated framerates (e.g. 600Hz). I believe they weren't actually updating the whole image at 600fps but rather updating part of the screen every 1/600th of a second, yet they did seem to produce a crisper image still. It's hard to separate out what benefit came from the higher frame rate as opposed to a faster-responding screen (ghosting due to the black -> white -> black response time curve makes quite a big difference in perceived blur), but since they could create a sharper image than a straight 240fps full-frame update, it suggests to me that there's still some blur to be removed above 240fps.

Very much into the realms of small differences at that point, but I suspect the sensible cut-off point for anything on a 1080p screen at a sensible viewing distance is probably somewhere around 300fps. In reality 90/120fps is pretty damn good for any real content, as anything moving fast enough to still be blurred leaves the screen so fast your brain doesn't have time to work out what it was anyway. The weird thing is that it's actually quite hard to see blur at high framerates until you put another display next to it that's better. Then the contrast in sharpness makes it obvious.

With a controlled scene like a steadily scrolling test card or text, your brain can see the blur at much higher framerates than most people would guess. Part of it is that you still get silly people saying things like "the human eye can only see 24fps", etc., which is just nonsense. Not sure where it came from, but I've heard it loads of times.
 
At 120fps you can still see blurring on some content fairly easily. I tested this on quite a lot of TVs a few years back by running the same scene side by side on a 120fps-input monitor and a 60fps-input TV with frame rate conversion. On a scene which the TV's frame rate conversion can do a good job on (e.g. a steady panning photo), the TV interpolation up to 240fps was sharper. So your eye can perceive 240fps as less blurred than 120fps.

Furthermore, there were a few TVs that claim silly high interpolated framerates (e.g. 600Hz). I believe they weren't actually updating the whole image at 600fps but rather updating part of the screen every 1/600th of a second, yet they did seem to produce a crisper image still. It's hard to separate out what benefit came from the higher frame rate as opposed to a faster-responding screen (ghosting due to the black -> white -> black response time curve makes quite a big difference in perceived blur), but since they could create a sharper image than a straight 240fps full-frame update, it suggests to me that there's still some blur to be removed above 240fps.
This is sample-and-hold blur. It's caused by the rotational motion of your eyes relative to each still frame when you're eye-tracking something. Higher framerates help because they reduce the per-frame duration of relative motion. Pulsing the image (CRT displays, black frame insertion) is another way to minimize it.
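A back-of-envelope sketch of how much that per-frame hold costs, using an assumed pan speed; a persistence below 1.0 stands in for black frame insertion or a strobed backlight:

```python
PAN_SPEED_PX_PER_S = 1920  # assumed: object crosses a 1080p-wide screen in one second

for fps, persistence in ((60, 1.0), (120, 1.0), (240, 1.0), (120, 0.25)):
    hold_s = persistence / fps                  # how long each frame sits on screen
    blur_px = PAN_SPEED_PX_PER_S * hold_s       # smear across the retina while eye-tracking
    print(f"{fps:3d} fps, persistence {persistence:.2f} -> ~{blur_px:.0f} px of blur")
```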
 
Yup, you've generally got some combination of four things causing blur on a display, and you need to fix all of them:

* sample and hold smudging = blurring equal to the distance between where your brain expects the object to be and where the screen is showing it to be. Your brain interpolates the object correctly if the screen doesn't get in the way (which is why black frame insertion and flashing displays like laser/CRTs look better).

* flashing blur = a compromise used by plasma displays that's somewhere between sample and hold and a CRT flashing screen at low fps. They tend to flash the same frame multiple times which does reduce the blur relative to sample and hold (because there's no persistent display of the object at the "wrong" location) but still leaves some blurring because they still show the object at the wrong location during each flash.

* judder = blurring due to uneven cadences of displayed frames relative to input framerate. This tends to turn up when you're trying to satisfy a vsync. A common example is something like displaying a 24fps movie, or a game rendering at ~25fps, on a display showing 60 frames/sec. The only way to do it is to show the first frame for 3 refreshes, then the second frame for 2 refreshes, etc. This looks terrible, and a repeating, uneven pattern makes your brain really sensitive to the amount of blurring (see the cadence sketch after this list).

* ghosting = display defect leaving trails behind bright objects. Caused by the fall time of a pixel in the display going from full white -> full black. Response time on some LCD displays is really poor. Regardless, even on an infinite FPS display you'll see blurred trails behind bright objects equal to the response time of the LCD.
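To make the judder cadence concrete, here's a small sketch of how many refreshes each source frame ends up shown for; the snap-to-next-refresh rule is a simplification of what real players and TVs do:

```python
import math

def cadence(source_fps: float, display_hz: float, n_frames: int = 8) -> list:
    """Refreshes per source frame when each new frame waits for the next display refresh."""
    ratio = display_hz / source_fps
    return [math.floor((i + 1) * ratio) - math.floor(i * ratio) for i in range(n_frames)]

print(cadence(24, 60))  # [2, 3, 2, 3, 2, 3, 2, 3] -> the familiar uneven 3:2-style pulldown
print(cadence(30, 60))  # [2, 2, 2, 2, 2, 2, 2, 2] -> even cadence, no judder
print(cadence(25, 60))  # [2, 2, 3, 2, 3, 2, 2, 3] -> irregular cadence, the worst case for judder
```

The even 30-in-60 cadence is why 30fps content doesn't judder on a 60Hz display the way 24fps content does.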
 
* sample and hold smudging = blurring equal to the distance between where your brain expects the object to be and where the screen is showing it to be. Your brain interpolates the object correctly if the screen doesn't get in the way (which is why black frame insertion and flashing displays like laser/CRTs look better).
It's not a brain interpolation thing. It's smudging due to eyeball exposure. If you opened a camera shutter as a frame began to be displayed, rotated the camera while the frame was being shown, and then closed the shutter as the frame switched to the next, the blur in the resulting image would be more or less the same thing as sample-and-hold blur in human vision.

flashing blur = a compromise used by plasma displays that's somewhere between sample and hold and a CRT flashing screen at low fps. They tend to flash the same frame multiple times which does reduce the blur relative to sample and hold (because there's no persistent display of the object at the "wrong" location) but still leaves some blurring because they still show the object at the wrong location during each flash.
It's pretty much exactly the same thing as sample-and-hold, but with a flickered image, meaning the artifact resolves as separated ghosts rather than a continuous smudge.

judder = blurring due to uneven cadences of displayed frames relative to input framerate.
Uneven cadence doesn't in and of itself cause blurring.
 