The Great Simulated Optics Debate *spawn

Games don't simulate what our eyes see; they simulate what a camera sees. People still haven't figured this out after all these years of 3D games?

The only time I've seen the "what your eyes see" discussion come up is in regard to the Oculus Rift, which actually does simulate human vision, so devs avoid things like motion blur and lens flare like the plague.

As Shifty said, they can do both.
One example of human eye behavior "simulated" in games is eye adaptation, and I have also seen flash blindness in shooters.
 
The other part that you describe is a different kind of blur and it's not motion blur.
I described motion blur. Motion blur is movement across the retina. Whether that's because the retina is still and the object moving, or the object still and the retina moving, it's the same phenomenon. In my example, the crowd is static relative to the retina because you are tracking its movement, while the car, which is in motion, sweeps across the retina in the opposite direction.
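A toy sketch of that "movement across the retina" idea (function name and numbers are mine, purely illustrative): whatever the eye keeps near zero retinal velocity stays sharp, and everything else smears in proportion.

```python
def retinal_velocity(object_ang_vel_dps, gaze_ang_vel_dps):
    # Image velocity across the retina: the object's angular velocity
    # relative to the viewer, minus the eye's own rotation.
    # Near zero -> sharp; anything else smears in proportion.
    return object_ang_vel_dps - gaze_ang_vel_dps

crowd, car = 40.0, 0.0   # deg/s relative to the viewer (illustrative values)
gaze = 40.0              # eye tracking the crowd
print(retinal_velocity(crowd, gaze))  # 0 deg/s   -> tracked crowd stays sharp
print(retinal_velocity(car, gaze))    # -40 deg/s -> the car smears the other way
```

Set `gaze = 0.0` instead and the result flips: the car stays sharp and the crowd smears.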

HTupolev said:
Adjusting the degree of blur according to distance from camera is extremely common.
This comes down to our need to talk about vision in terms of angles. Near objects move relatively faster than more distant objects. Resolution, motion, and anything else related to the eye should be measured in angular/polar space rather than linear space!
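To put rough numbers on that, here's a minimal sketch (function and values are mine, purely illustrative) of how the same linear speed translates into very different angular speeds depending on distance:

```python
import math

def angular_speed_deg(linear_speed_mps, distance_m):
    # Small-angle approximation for an object moving tangentially:
    # angular speed (rad/s) = linear speed / distance, converted to deg/s.
    return math.degrees(linear_speed_mps / distance_m)

# The same ~30 m/s (~67 mph) relative motion:
print(angular_speed_deg(30.0, 2.0))    # trackside crowd at 2 m  -> ~859 deg/s
print(angular_speed_deg(30.0, 200.0))  # scenery at 200 m        -> ~8.6 deg/s
```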

One example of human eye behavior "simulated" in games is eye adaptation...
You can have cameras with auto exposure that give the same result.
 
I described motion blur. Motion blur is movement across the retina. Whether that's because the retina is still and the object moving, or the object still and the retina moving, it's the same phenomenon. In my example, the crowd is static relative to the retina because you are tracking its movement, while the car, which is in motion, sweeps across the retina in the opposite direction.

In your example the car is not moving... and you can't track the crowd, and it's pointless to try anyway when you're racing...

You may look in the direction of the crowd, but you'd be an idiot to try and track them for no apparent reason or benefit, as they'd be whizzing by at over 100 MPH. Unless you're on a sightseeing tour driving at a snail's pace, it's utterly pointless.
 
In your example the car is not moving... and you can't track the crowd, and it's pointless to try anyway when you're racing...

You may look in the direction of the crowd, but you'd be an idiot to try and track them for no apparent reason or benefit, as they'd be whizzing by at over 100 MPH. Unless you're on a sightseeing tour driving at a snail's pace, it's utterly pointless.
It's a hypothetical scenario to illustrate exactly how motion blur is experienced. Replace 'car' with 'horse' or 'football' or whatever else. If you are tracking the moving object, the static scenery is blurred. If you switch to look at the static scenery, the moving object is blurred.

Are we going to reach any point in this discussion where people actually understand what motion blur is, how it's present in human optics as well as video, and where it fits into games, or am I going to be spouting examples and references ad infinitum?
 
Are we going to reach any point in this discussion where people actually understand what motion blur is, how it's present in human optics as well as video, and where it fits into games, or am I going to be spouting examples and references ad infinitum?

Doubtful :p. There are some good insights in this thread, but they're buried under so much noise and misinformation that I don't think anyone could gain any useful knowledge on the subject unless they already understood motion blur and why and how it should be in games.

Of course, the sentiment "I hate motion blur and it should not be in games" is not unfounded. Hardly any developers seem to understand how it should work either, so we get these awful "blur everything that moves!" jobs.
 
Sorry, but that's the only conclusion I can come to after seeing what they did in Mass Effect :D
 
It's a hypothetical scenario to illustrate exactly how motion blur is experienced. Replace 'car' with 'horse' or 'football' or whatever else. If you are tracking the moving object, the static scenery is blurred. If you switch to look at the static scenery, the moving object is blurred.

Most people here in the thread have shown that they already know all those basics. Those who have not done their homework before generating noise here can just be selectively ignored for now.
 
Yeah, but usually devs say they want to reproduce eye adaptation, not camera auto exposure.

With regard to that, I've seen more than one dev use the expression "eye adaptation" to describe their HDR exposure control and TODing, while all the math and theory shown subsequently to justify it is based on film-production and camera-technology know-how. "Eye adaptation" is just a word here, in all truth.
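For what it's worth, the usual trick behind both labels looks the same: ease an "adapted" luminance toward the scene's average over time and expose by it. A minimal sketch (function name, rate constant, and numbers are mine, not from any particular engine):

```python
import math

def adapt_luminance(adapted, scene_avg, dt, rate=1.5):
    # Exponentially ease the adapted luminance toward the current scene
    # average; `rate` controls how fast the "eye"/auto-exposure reacts.
    return adapted + (scene_avg - adapted) * (1.0 - math.exp(-dt * rate))

# Walking from a dim interior (avg 0.05) into daylight (avg 5.0) at 30 fps:
lum = 0.05
for frame in range(5):
    lum = adapt_luminance(lum, 5.0, dt=1.0 / 30.0)
    print(f"frame {frame}: adapted luminance {lum:.3f}")
```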
 
This comes down to our need to talk about vision in terms of angles. Near objects move relatively faster than more distant objects. Resolution, motion, and anything else related to the eye should be measured in angular/polar space rather than linear space!
If we're trying to pinpoint which coordinate systems most directly describe the expected motion blur, sure.

If we're trying to discuss implementation and motion estimation, it's often useful to speak in other terms. The notion of motion parallax varying by distance is a perfectly useful concept in these conversations.
(And it's still a perfectly valid concept when using polar coordinates, even if representing it can sometimes be strange. Particularly since, at the end of the day, "good" representations for all this stuff are always going to be a bit wonky if you want to correctly account for the warping of perspective projection and whatnot.)
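As a rough illustration of that parallax point (my own toy numbers, assuming a simple pinhole projection): a purely lateral relative velocity projects to a screen-space velocity that falls off with depth.

```python
def screen_velocity_px(lateral_speed_mps, depth_m, focal_px=1000.0):
    # Pinhole model: x_screen = focal_px * X / Z, so a lateral world
    # velocity Vx at depth Z projects to focal_px * Vx / Z pixels/s.
    return focal_px * lateral_speed_mps / depth_m

# Lateral relative motion of 10 m/s seen at various depths:
for depth in (2.0, 20.0, 200.0):
    print(f"depth {depth:5.1f} m -> {screen_velocity_px(10.0, depth):7.1f} px/s")
```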
 
I expect it means 1x the distance moved per frame.

Incidentally, three more real-world examples of human visual motion blur that I noticed today:

1) Playing the piano. If you look at the keyboard, your hands are a blur. It's even a saying among spectators, who'll claim a performer's fingers are so fast 'they're a blur'.

2) At a junction. If you look at the traffic lights, cars passing across in front of you are blurred, and if you track those cars, the traffic lights are blurred. This one's easy to spot.

3) Urinating. Urine looks like a continuous stream but, as Mythbusters demonstrated with a high-speed camera, it's made of lots of discrete blobs. It only appears as one continuous stream because it's moving too fast and all those blobs blur together.

That should definitely convince the last of the doubters that motion blur exists in the human visual experience outside of TV and movies, and hence is an effect that it's fair to try and include in game rendering.
 
Open question, and sorry for asking, but why is "1.0 motion blur" called realistic there?
What is the parameter that determines how "realistic" motion blur is in games/movies?
1.0 motion blur means the image contains everything that happened during 1/24th of a second, or whatever the frame interval is. It's as if the infinity of frames in between were merged together. So if a bright point is moving 100 pixels between frames, a 1.0 motion blur will show a perfect line of 100 px, and the next frame will have a line that continues another 100 px exactly where the previous frame ended. Less than 1.0 will show a dotted line (missing information); more than 1.0 will show overlapped lines (multiple frames combined). It's realistic because it's mathematically correct: no added information, no loss of information.

For feature films, 0.5 motion blur is the norm (a 180-degree shutter). It's pretty much what most people are used to, and it's a good compromise that will probably stay as long as we're stuck at 24 fps. With higher frame rates I think 1.0 becomes more interesting.
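A tiny sketch of that bookkeeping (my own naming, just to make the numbers concrete): the shutter angle fixes what fraction of the frame interval gets integrated, and that fraction scales the streak a moving point leaves.

```python
def shutter_fraction(shutter_angle_deg):
    # Film convention: a rotary shutter open over `shutter_angle_deg`
    # of its 360-degree rotation exposes that fraction of the frame time.
    return shutter_angle_deg / 360.0

def streak_length_px(pixels_per_frame, blur_scale):
    # A point moving `pixels_per_frame` px between frames leaves a streak
    # of blur_scale * pixels_per_frame px (1.0 = streaks tile with no gaps).
    return pixels_per_frame * blur_scale

print(shutter_fraction(180))          # 0.5 -> the film norm
print(streak_length_px(100, 1.0))     # 100 px streak, continuous line frame to frame
print(streak_length_px(100, 0.5))     # 50 px streak + 50 px gap (the dotted line)
```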
 
OK, but I mean: how do we determine whether the amount of motion blur applied in a game/movie is realistic, or better, truthful to human vision?
Is there a measured average of motion blur perceived by individuals to use as a reference, or can developers/filmmakers apply any amount of motion blur 'arbitrarily', following mainly their own judgement or artistic sense?
 
OK, but I mean: how do we determine whether the amount of motion blur applied in a game/movie is realistic?
Is there a measured average of motion blur perceived by individuals to use as a reference, or can developers/filmmakers apply any amount of motion blur 'arbitrarily', following mainly their own judgement or artistic sense?
That's a tricky issue. At any given framerate there's a certain motion blurring that will capture the full range of motion of an object, and there's a certain motion blurring (perhaps subjective) that will make the result most closely resemble real life. These two things do not necessarily coincide; at very low framerates, the former approach dictates using HUGE sweeps of blur, which is probably why film tends to shutter for only half the duration of a frame; strictly speaking, you lose information, but the result doesn't look so "argh blur NOOOOOOO."
 
OK, but I mean: how do we determine whether the amount of motion blur applied in a game/movie is realistic, or better, truthful to human vision?
You can't. Human vision doesn't experience discrete samples, so you won't get a blurred interpolation of two frames advancing frame to frame. E.g. consider a white dot moving 100 px. With 1x motion blur, it is a line 100 px long. The next frame, it's a line of the same length moved 100 pixels on - that's a discrete motion jump that wouldn't exist in the real world. The real-life blur would consist of 100 'pixels' of blur moving at a constant speed.

As an approximation, 1x frame is good enough. 2x will give more coherent motion. I'd say it's up to the devs to pick a target compromise. It's a balancing act between a set of criteria that's impossible to resolve to a perfect union, so the final choice is ultimately going to be pretty arbitrary. Less blur = more clarity when needed, more jerkiness when not. More blur = more smoothness where it should be but loss of detail where it shouldn't be.
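To make the 1x-versus-2x trade-off concrete, here's a minimal sketch (my own interpretation of "blur scale", purely illustrative) of the pixel span each frame's streak covers for a point moving 100 px per frame:

```python
def streak_span(frame, pixels_per_frame=100, blur_scale=1.0):
    # The streak drawn on `frame` ends at the point's current position and
    # reaches back over blur_scale frames' worth of motion.
    end = (frame + 1) * pixels_per_frame
    return (end - blur_scale * pixels_per_frame, end)

for f in range(3):
    print(f"frame {f}: 1.0x {streak_span(f, blur_scale=1.0)}  "
          f"2.0x {streak_span(f, blur_scale=2.0)}")
# 1.0x streaks tile exactly (0-100, 100-200, ...); 2.0x streaks each overlap
# the previous frame's motion, trading clarity for smoother-looking motion.
```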

The most scientific solution would be to poll gamers on which of a set of test cases they prefer and pick the most-voted settings as the ones pleasing to the largest audience. Alternatively, leave it to your lead artist and make it part of the style of the game.
 