The Great Simulated Optics Debate *spawn

But since the PS4 has room to spare when outputting 1080p at 60fps, Kojima unwisely decided to use that gravy to heavily motion blur the image every time you pan the camera even slowly, or just run. You really have to pan the camera very, very slowly, or walk, to avoid triggering this unwanted artificial blur. Nobody ever complained that Ground Zeroes lacked motion blur, right?

Motion blurring a 60fps-locked videogame? Heresy!

Why, Kojima? Why? :???:

Motion blur > no motion blur :devilish:
 
Motion blur > no motion blur :devilish:

No, no, no. I strongly disagree. In all my videogame experience I have never seen any advantage to a motion-blurred image, even in 30fps games. Never. Quite the contrary.

Because the motion blur is added on top of an image that is already blurred by ghosting. With only the "natural" ghosting blur of 30fps (or 60fps) I can still make out details in the image if I want to follow moving objects. The artificial motion blur completely destroys the few high-frequency details the image still has.

Some people, like me, can still see a good amount of detail, even while quickly panning in a 30fps game. For us the motion blur is a disturbance, a very noticeable anomaly.

In this MGS5 video I can easily see when things are motion blurred, like Snake's legs and arms when he runs; his legs suddenly become some fuzzy, hazy contraption! It's distracting and useless, especially in a 60fps game. In fact I don't care about Snake's body, but I do care about the beautiful scenery being motion-blurstroyed every time you barely move the camera! I didn't hear you complain that Ground Zeroes wasn't motion blurred enough! ;)

And what about people who have motion-interpolating TVs? The motion blur will completely defeat the purpose of those expensive TVs, because the interpolated frames will be constructed from already destroyed (blurred) material.

If only because those TVs exist, developers really should add an option to turn off any artificial cinematic blur. Virtual reality can't come soon enough! :LOL:
 
Well, I don't see why motion blur would have to disappear in 60fps games. Real life has a higher rate of information and yet we see motion blur all the time.

Ghosting isn't the same thing and if it's a problem in your setup I recommend you upgrade your TV :p

Motion interpolation isn't a problem since the game already runs at 60fps. All you'd get by turning it on is lag.
 
Well, I don't see why motion blur would have to disappear in 60fps games. Real life has a higher rate of information and yet we see motion blur all the time.

Ghosting isn't the same thing and if it's a problem in your setup I recommend you upgrade your TV :p

Motion interpolation isn't a problem since the game already runs at 60fps. All you'd get by turning it on is lag.

You don't actively "see" motion blur on any object you can focus your eyes on. Well, sorry, but I definitely don't.

Motion blur is an optical effect that only exists in real life, well, "in your brain", if you don't actively follow a moving object with your eyes, including a moving object in a 3D scene watched on a 2D screen.

Thus motion blur will automatically be created by your brain if you don't focus on quickly moving objects in a videogame, even if those images aren't artificially motion blurred.

If you follow a moving object with your eyes in real life you will never see any motion blur, so it's completely unrealistic to see motion blur on slowly moving objects in a videogame when you can easily follow them with your eyes. More so in a 60fps game, where the ghosting blur is minimal and where even moderately fast-moving objects can easily be tracked and discerned with good eyes.

Artificially creating motion blur in any videogame is wrong from a realism standpoint, however you want to look at it, pun intended :cool:. It only makes sense if you want to create artistic, cinematic (unrealistic) effects like bloom, DOF or lens flares.

And about motion interpolation, what about 120fps motion interpolation on a 60fps game?
 
Motion blur happens in the brain for the same reason it happens in a camera: exposure time. You can avoid motion blur in real life even if you're not focusing on the moving object if the area is illuminated by a strobing light (common in clubs).

Motion blur doesn't happen naturally in games because all the brain is seeing is still frames, not a continuum of information, hence it needs to be added as a post-processing effect. At best what you'll see is ghosting, which is a very different effect.
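
(To make the post-processing point concrete, here is a minimal toy sketch of the usual approach: render a velocity buffer alongside the colour frame and average a few samples along each pixel's motion vector. The function names and numbers are invented for illustration; a real engine does this per pixel in a shader.)

```python
# Toy 1D "velocity buffer" motion blur, purely illustrative.

def motion_blur_1d(frame, velocity, samples=8):
    """Average `samples` taps along each pixel's motion vector."""
    width = len(frame)
    blurred = []
    for x in range(width):
        total = 0.0
        for s in range(samples):
            # Step backwards along the motion vector, clamped to the frame.
            t = s / (samples - 1) if samples > 1 else 0.0
            src = min(max(int(round(x - velocity[x] * t)), 0), width - 1)
            total += frame[src]
        blurred.append(total / samples)
    return blurred

# A bright 'object' 3 pixels wide, moving 4 pixels per frame to the right.
frame = [0.0] * 16
frame[6] = frame[7] = frame[8] = 1.0
velocity = [4.0] * 16

print(motion_blur_1d(frame, velocity))  # the object gets smeared over ~7 pixels
```

The smear in the output is entirely synthesized; none of it comes from the display or the eye, which is the point being made above.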

To be perfectly realistic, games would need to track the position of the player's eyes to properly add motion blur and DOF where needed. Then again, videogame cameras nowadays don't try to emulate eyes but cameras, so the way they handle these effects is perfectly appropriate.

Having the option to turn the effects off for the people who dislike them would be nice though.

As for 120Hz, you'd still introduce lag. Also, even if you had motion blur in the image, there are other situations in which motion interpolation fails, so you'd still get plenty of artifacts.
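
(Rough numbers on that lag, assuming the TV has to buffer the next real frame before it can synthesize the in-between one; the processing figure is a guess, not a measured value:)

```python
# Back-of-the-envelope lower bound on motion-interpolation lag.
source_fps = 60
frame_time_ms = 1000 / source_fps   # 16.7 ms until the next real frame exists
processing_ms = 5                   # assumed time for the TV's own interpolation work
added_lag_ms = frame_time_ms + processing_ms

print(f"Minimum added display lag: ~{added_lag_ms:.1f} ms")  # ~21.7 ms on top of normal latency
```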
 
To be perfectly realistic, games would need to track the position of the player's eyes to properly add motion blur and DOF where needed. Then again, videogame cameras nowadays don't try to emulate eyes but cameras, so the way they handle these effects is perfectly appropriate.

And that, IMO, is why motion blur is a failure in games.

In a movie, I'm watching something and have no input. Hence, I'm generally looking where the director wants me to look. When I don't do that, the whole illusion breaks down because my eyes are unable to focus on something they would be able to focus on in real life.

In real life if I track a fast moving object (say a car moving at 150 mph) that object will be perfectly in focus while the surroundings are blurred. If I then focus on another object moving at a different speed that object is now in perfect focus while the original car is now blurred relative to what I am tracking.

In a game without motion blur there will be some natural blur induced if I track that object with my eyes without moving the in game camera. If I move the camera everything goes stuttery/blurry depending on how often the image is refreshed in game but whatever I'm focused on still remains relatively sharp. The lower the frame rate (30 fps for example) the more stuttery/blurry/objectionable the effect. And at any point I can stop the camera and focus on a moving object and it will be in relatively sharp focus, minus the stuttering due to low framerate.

With motion blur in a game I have the absolute worst of all worlds.

"I" am the director in a game. "I" get to choose what should be focused upon, exception being cinematics or heavily HEAVILY scripted scenes. However, "I" don't get to choose what gets motion blur and what doesn't if the game implements it. And that, IMO, is a huge HUGE problem. I go to track a fast moving object where the game developer has chosen to apply motion blur? Tough luck, it's blurry even though I'm focused on it.

Hence, I will not play any game where I cannot turn off the motion blur in game. Which also means I won't play any game on console that features motion blur if I can help it. Thank goodness for PC.

I don't play games to have my eyes tortured by a blurry mess when I focus my eyes on them. I had to live with that without glasses before I had laser eye surgery. I'm not keen on reliving the days when I couldn't focus on something I was looking directly at.

That said, I always leave motion blur enabled for cutscenes when the option is there. As then I'm basically watching a short film anyway.

Motion blur during gameplay, however? EPIC fail. Far worse than something like bloom or chromatic aberrations due to simulating a camera lens. At least then the game still respects the fact that "I" am the director of the game while playing. And even though I'm basically looking through a camera, "I" am still in control and "I" get to determine what my eyes are focused on.

Now, if technology eventually comes to gaming where the game can track where I'm looking and intelligently apply motion blur? I'd probably be OK with that.

Yes, I absolutely hate motion blur during gameplay with a passion.

Regards,
SB
 
In a movie, I'm watching something and have no input. Hence, I'm generally looking where the director wants me to look. When I don't do that, the whole illusion breaks down because my eyes are unable to focus on something they would be able to focus on in real life.
That was a real breaker for 3D. If you see a blurred object in the foreground and try to focus on it, it remains blurred, and the whole situation feels fake.
 
You don't actively "see" motion blur on any object you can focus your eyes on. Well, sorry, but I definitely don't.
Bring your hand up and casually shake it in front of you (nothing intense); you'll see real-life object motion blur. It's because your eyes can't focus that fast. Slowly move your hand and focus on it, and the background gets blurred... that's real-life camera motion blur for you. Games try to replicate this with post-processing.


Motion blur happens in the brain for the same reason it happens in a camera: exposure time. You can avoid motion blur in real life even if you're not focusing on the moving object if the area is illuminated by a strobing light (common in clubs).

Motion blur doesn't happen naturally in games because all the brain is seeing is still frames, not a continuum of information, hence it needs to be added as a post-processing effect. At best what you'll see is ghosting, which is a very different effect.

To be perfectly realistic, games would need to track the position of the player's eyes to properly add motion blur and DOF where needed. Then again, videogame cameras nowadays don't try to emulate eyes but cameras, so the way they handle these effects is perfectly appropriate.

Having the option to turn the effects off for the people who dislike them would be nice though.

As for 120Hz, you'd still introduce lag. Also, even if you had motion blur in the image, there are other situations in which motion interpolation fails, so you'd still get plenty of artifacts.

Out of curiosity, how do movies get their motion blur then? They don't add it later obviously.
 
And that, IMO, is why motion blur is a failure in games.

In a movie, I'm watching something and have no input. Hence, I'm generally looking where the director wants me to look. When I don't do that, the whole illusion breaks down because my eyes are unable to focus on something they would be able to focus on in real life.

In real life if I track a fast moving object (say a car moving at 150 mph) that object will be perfectly in focus while the surroundings are blurred. If I then focus on another object moving at a different speed that object is now in perfect focus while the original car is now blurred relative to what I am tracking.

In a game without motion blur there will be some natural blur induced if I track that object with my eyes without moving the in game camera. If I move the camera everything goes stuttery/blurry depending on how often the image is refreshed in game but whatever I'm focused on still remains relatively sharp. The lower the frame rate (30 fps for example) the more stuttery/blurry/objectionable the effect. And at any point I can stop the camera and focus on a moving object and it will be in relatively sharp focus, minus the stuttering due to low framerate.

With motion blur in a game I have the absolute worst of all worlds.

"I" am the director in a game. "I" get to choose what should be focused upon, exception being cinematics or heavily HEAVILY scripted scenes. However, "I" don't get to choose what gets motion blur and what doesn't if the game implements it. And that, IMO, is a huge HUGE problem. I go to track a fast moving object where the game developer has chosen to apply motion blur? Tough luck, it's blurry even though I'm focused on it.

Hence, I will not play any game where I cannot turn off the motion blur in game. Which also means I won't play any game on console that features motion blur if I can help it. Thank goodness for PC.

I don't play games to have my eyes tortured by a blurry mess when I focus my eyes on them. I had to live with that without glasses before I had laser eye surgery. I'm not keen on reliving the days when I couldn't focus on something I was looking directly at.

That said, I always leave motion blur enabled for cutscenes when the option is there. As then I'm basically watching a short film anyway.

Motion blur during gameplay, however? EPIC fail. Far worse than something like bloom or chromatic aberrations due to simulating a camera lens. At least then the game still respects the fact that "I" am the director of the game while playing. And even though I'm basically looking through a camera, "I" am still in control and "I" get to determine what my eyes are focused on.

Now, if technology eventually comes to gaming where the game can track where I'm looking and intelligently apply motion blur? I'd probably be OK with that.

Yes, I absolutely hate motion blur during gameplay with a passion.

Regards,
SB

Thank you for this post. Thank you for putting so much energy into this great "movie director" analogy in order to make our point better understood here. But a lot of explanation will still be necessary, I'm afraid.

nightshade said:
Out of curiosity, how do movies get their motion blur then? They don't add it later obviously.
Motion blur in movies is a bad visual artefact caused by the technical limitations of 20th-century cameras. Game directors love to reproduce it in videogames for its nostalgic/artistic value, not because it's realistic.

nightshade said:
Bring your hand up and casually shake it in front of you (nothing intense); you'll see real-life object motion blur. It's because your eyes can't focus that fast. Slowly move your hand and focus on it, and the background gets blurred... that's real-life camera motion blur for you. Games try to replicate this with post-processing.
I know perfectly well what motion blur is and how to "see" it in real life, thank you. Please re-read my previous post and Silent_Buddha's post to better understand our shared opinion on how motion blur is currently used in a very wrong way in the videogame industry, the worst way: during gameplay.
 
Motion blur in movies is a bad visual artefact caused by the technical limitations of 20th-century cameras. Game directors love to reproduce it in videogames for its nostalgic/artistic value, not because it's realistic.
That's an inaccurate generalisation. Motion blur should not be applied to what you're looking at, but it should be present on what you're not looking at. Its exclusion from games is just as unrealistic as its inclusion. E.g. in a car game, the scenery around you that you aren't tracking will be blurred. If you don't apply blur in a game, you'll get a fast-shutter camera effect rather than a real-life driving effect.

One can argue for or against the application of moblur in games, but to claim it's unrealistic is inaccurate and leads to these repetitious arguments. Moblur is perfectly natural. The only correct solution is very high frame-rate games. Until we have that, the choice to add or exclude moblur from a game is down to the developers, and the appreciation will be subjective. You may not like it, but it isn't categorically 'wrong' or 'unnatural'.
 
That's an inaccurate generalisation. Motion blur should not be applied to what you're looking at, but it should be present on what you're not looking at. Its exclusion from games is just as unrealistic as its inclusion. E.g. in a car game, the scenery around you that you aren't tracking will be blurred. If you don't apply blur in a game, you'll get a fast-shutter camera effect rather than a real-life driving effect.

One can argue for or against the application of moblur in games, but to claim it's unrealistic is inaccurate and leads to these repetitious arguments. Moblur is perfectly natural. The only correct solution is very high frame-rate games. Until we have that, the choice to add or exclude moblur from a game is down to the developers, and the appreciation will be subjective. You may not like it, but it isn't categorically 'wrong' or 'unnatural'.

Sorry, but you still don't really understand what Silent_Buddha and I are painfully and patiently trying to explain: there is no need to create artificial motion blur or even DOF in a videogame, even on what you're not looking at (if we could, in the future, with some eye-tracking device), because the brain will always automatically create it for you, most particularly in a 60fps game like MGS5.

Unfortunately we'll keep having repetitive arguments until everybody really understands the nostalgic cinematic legacy of motion blur and DOF, how wrongly they are used in the videogame industry, and the problems they create for realism.

Seeing motion blur (or DOF) during active gameplay, even on objects I decide to focus my eyes on, is completely unrealistic, more so now with the advent of motion-interpolating TVs. It shouldn't be replicated in the videogame medium except for movie nostalgia, artistic purposes, or in a cinematic cut-scene where the game director wants us to see only parts of the image, like the characters' faces.

Realistic motion blur, in real life, is only an illusion created by our brain; it has been such a "brain illusion" for millions of years. Artificial motion blur shown on physical devices (TV screens) was created by accident as a byproduct of deficient optical devices (along with DOF, chromatic aberration, lens flares, bloom, etc.) and then wrongly replicated, following the example of those deficient devices, in the videogame industry.
 
Just because you keep repeating your argument in an assertive and condescending tone doesn't mean it will suddenly become true.

Motion blur in games isn't created by the brain regardless of framerate. I already explained this to you but you chose to simply ignore me.

Try to educate yourself a little instead of going around spouting nonsense.
 
Your eyes cannot just "create" DOF in video games when it's not there, because there is no depth in the first place, which is why you need post-processing to visually portray it on screen. Your argument would hold true if you were talking about VR or 3D, because in that case your eyes naturally do the job, since there is depth to the information available to you. It's the same with motion blur: your eyes cannot create it because the information your eyes receive (even at 60fps or 120fps) isn't enough to create motion blur.


In FPS games with motion blur, when you turn the camera, for example, the gun stays in focus but the environment is blurred; I think that's perfectly realistic. Likewise for situations when, say, you switch guns and the gun is blurred by object motion blur while the environment is in focus. I think that's perfectly realistic, as you cannot focus on both at the same time; that is how it works even in TPS games, and that's how it works in real life. Your eyes cannot just create that information out of thin air, even at 60fps, because at the end of the day what you are seeing is 60 still frames, which is nothing compared to the information your eyes receive in real life.
 
Maybe the human brain detects motion blur in different ways?

Because I do see motion blur in games, but I always disable motion blur (more like, just blur), because with motion blur I can't play very long before I get uncomfortable.

For example, while writing this reply, I see the reply box clearly but the other areas are blurred.
 
Sorry, but you still don't really understand what Silent_Buddha and I are painfully and patiently trying to explain
:rolleyes: Silent_Buddha even agrees with me!

Silent_Buddha said:
In real life if I track a fast moving object (say a car moving at 150 mph) that object will be perfectly in focus while the surroundings are blurred. If I then focus on another object moving at a different speed that object is now in perfect focus while the original car is now blurred relative to what I am tracking.
Motion blur exists everywhere except on what you are tracking, if you can track it fast enough (and there are limits to human eye-tracking speed, but we can ignore those for the purposes of games).

...because the brain will always automatically create it for you
Real-life motion blur is produced from the accumulation of continuous light samples over a period of time. 'Motion blur' from a TV screen is produced from the accumulation of discontinuous light samples at 30/60 fps. The end result is not the same and is clearly visually different. Case in point: wave your hand quickly in front of your eye. The fingers blur, unless you have super-eyes that can track the fingers at that considerable speed. With a waving hand rendered at 60 fps, you'd see not a smearing of fingers across space but a discrete set of finger images.
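
(A crude numerical illustration of that difference, with everything simplified: a point of light crossing the view during one 'exposure', integrated once from near-continuous samples and once from the six samples a 60 fps display contributes in the same window. All names and numbers here are invented for the example, not a model of vision.)

```python
# Toy comparison: continuous-ish light integration vs. 60 fps sampling.

def integrate(positions, width=32):
    """Accumulate a unit point light at each sampled position."""
    image = [0.0] * width
    for p in positions:
        image[min(int(p), width - 1)] += 1.0 / len(positions)
    return image

exposure_s, speed_px_s = 0.1, 300  # the point crosses 30 'pixels' in 100 ms

continuous = integrate([speed_px_s * exposure_s * i / 999 for i in range(1000)])
discrete = integrate([speed_px_s * (n / 60) for n in range(int(exposure_s * 60))])

print("continuous:", [round(v, 2) for v in continuous])  # a smooth smear
print("60 fps:    ", [round(v, 2) for v in discrete])    # a few separate copies
```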

It is not possible to get natural motion blurring in games with current tech. We either need selective motion blur + DOF using eye tracking, or really fast refreshes. Until then, the choice to blur a game's graphics so it more closely represents real life optics (whether camera or human vision) is down to devs, and subjective.
 
Your eyes cannot just "create" DOF in video games when it's not there, because there is no depth in the first place, which is why you need post-processing to visually portray it on screen. Your argument would hold true if you were talking about VR or 3D, because in that case your eyes naturally do the job, since there is depth to the information available to you. It's the same with motion blur: your eyes cannot create it because the information your eyes receive (even at 60fps or 120fps) isn't enough to create motion blur.

In FPS games with motion blur, when you turn the camera, for example, the gun stays in focus but the environment is blurred; I think that's perfectly realistic. Likewise for situations when, say, you switch guns and the gun is blurred by object motion blur while the environment is in focus. I think that's perfectly realistic, as you cannot focus on both at the same time; that is how it works even in TPS games, and that's how it works in real life. Your eyes cannot just create that information out of thin air, even at 60fps, because at the end of the day what you are seeing is 60 still frames, which is nothing compared to the information your eyes receive in real life.
One correction: DOF is an optical effect, not a perceptual effect, meaning your brain isn't responsible for it; the lenses in your eyes are. The screens in a VR headset are at a fixed distance, so everything on the screen is at the same focus; therefore you need post-processing to create the DOF effect. The perceptual cues of depth come from parallax effects.
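
(Since DOF keeps coming up: the optical effect reduces to the thin-lens circle-of-confusion formula, which is roughly what post-process DOF approximates per pixel. A quick sketch with arbitrary example values, not taken from any particular game or headset:)

```python
# Thin-lens circle of confusion (CoC): how large a point at distance `d`
# is smeared on the sensor when the lens is focused at `focus_d`.

def circle_of_confusion(d, focus_d, focal_len, aperture_f):
    """All distances in metres; returns the CoC diameter in millimetres."""
    aperture_diam = focal_len / aperture_f
    coc = aperture_diam * (abs(d - focus_d) / d) * (focal_len / (focus_d - focal_len))
    return coc * 1000  # metres -> millimetres

# Example: a 50 mm f/1.8 lens focused at 2 m.
for dist in (1.0, 2.0, 5.0, 10.0):
    print(f"{dist:>4} m -> CoC {circle_of_confusion(dist, 2.0, 0.05, 1.8):.2f} mm")
```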

Maybe the human brain detects motion blur in different ways?

Because I do see motion blur in games, but I always disable motion blur (more like, just blur), because with motion blur I can't play very long before I get uncomfortable.

For example, while writing this reply, I see the reply box clearly but the other areas are blurred.

If the screen is blurry while the image is static the problem is definitely with your TV/monitor.
 
Bring your hand up and casually shake it in front of you (nothing intense); you'll see real-life object motion blur. It's because your eyes can't focus that fast. Slowly move your hand and focus on it, and the background gets blurred... that's real-life camera motion blur for you. Games try to replicate this with post-processing.

However, if you track your hand's motions your hand will remain perfectly in focus. The background, of course, will then be a blurry mess due to a combination of relative motion and being on a different focal plane. Or if you focus on the background, then your hand is a blurry mess for the same reasons.

Here's the problem with games. If you implement that, the game developer has no way to determine which you are doing, so it will get it wrong approximately 50% of the time. Sometimes those percentages are better in heavily scripted scenes, and sometimes they will be worse. Meaning sometimes it'll look correct in game, and sometimes it will look horribly, HORRIBLY wrong.

As I said, in films it "mostly" works because we're looking where the director wants us to look. There is no interaction possible from the viewer, so most of the time people look where the director wants them to look, if the director does a good job. If they don't, people's eyes will wander and the illusion breaks.

In games, it almost never works, because it IS interactive. And thus the player always has the option to look around the scene to assess the situation (enemies, hazards, entrances, exits, where to move to next, etc.). If anything has motion blur or DOF applied to it then the illusion is immediately broken as soon as the player looks at it directly. For example, in a shooter game when scanning the battlefield for targets, or in a driving game when checking for alternative routes.

IMO (and I realise this whole topic of simulated motion blur is highly opinion-based), motion blur and DOF (with a few exceptions, like looking down the iron sights of a gun) are far, far more harmful in breaking the illusion than not having any motion blur or DOF. At least until such time as there is a system in place that can somewhat reliably track where the player is looking, so that motion blur and DOF can be intelligently applied.

Until that happens, however, every time I look directly at something and what I'm looking at is a blurred mess due to motion blur or DOF, I will always hate it, because it not only reminds me of when I had bad vision, but it also makes my eyes hurt as they strain to bring it into focus but are unable to.

I know people in RL who suffer headaches from trying to bring a motion-blurred or DOF-blurred object into focus when they look directly at the object. Their brain can't handle the fact that the object cannot be brought into focus no matter how hard it (and the eyes) tries, and it continues to try no matter what.

Regards,
SB
 
edit2: whoops, looks like I replied while missing a few posts, apologies!

I think there is a big misunderstanding regarding motion blur.
I don't think Silent_Buddha is referring to high-quality, velocity-based object motion blur.

I mean: Silent_Buddha, hold your hand in front of your face and wave it around.
What you see is high-quality, retina-grade, velocity-based object motion blur. (Unless you are waving your hand in front of your old 50Hz CRT monitor.)

So when Snake is running/sprinting viewed from the side, the outside parts of his legs are moving faster than the rest. In reality those parts would not appear sharp, so having them appear sharp, as Silent_Buddha wants, would actually clash with reality: unless the sun explodes and is replaced by an artificial low-frequency strobe light, the legs of running people will always have real-life blur applied to them.

If you have access to a children's playground, Silent_Buddha, go sit on a carousel and have some children spin it around while you stare outwards.
Is the background perfectly sharp? Or might it have.... blur applied to it?
I am guessing the latter.

So in conclusion: thank great developers for excellent implementations of good motion blur.

When we have 500Hz displays and games rendering at 500fps, then I agree: motion blur is not needed anymore.

edit: devil's advocate; it is possible to follow a fast-moving object, if only for a really short period of time. For example, you can sometimes see parts of a fast-spinning wheel perfectly sharp because, for a fraction of a second, your eyes move at the same speed.
In the carousel example, it could be possible to focus on a distant tree, although that would require the ability to turn your head as well, as it will only appear sharp for a very short fraction of a second.
 
However, if you track your hand's motions your hand will remain perfectly in focus.
You can't track your hand's motions. At least I can't. Even with a fairly gentle wave against a window I cannot track the fingers, and they blur. I have to slow the wave right down to a slow rotation to be able to fix on a finger.

Here's the problem with games. If you implement that, the game developer has no way to determine which you are doing, so it will get it wrong approximately 50% of the time...

In games, it almost never works, because it IS interactive.
That's where things get very subjective. If you are tracking a slow-moving player and suddenly they move, there'll be some blurring. A quick Google suggests a 150ms response time, which is mostly mitigated by predictive tracking. A sudden change means 150ms of blur before the eye can latch onto the subject. Also, if a dude is running towards you waving his arms around, he may be in focus but his arms will be blurred. That's lots of blur going on in real life that's just ignored. It only breaks in game where it's applied to excess where you're not looking, but a well-designed game should be able to limit it to the periphery and certain objects such that it's not wrong 50% of the time.

And thus the player always has the option to look around the scene to assess the situation (enemies, hazards, entrances, exits, where to move to next, etc.). If anything has motion blur or DOF applied to it then the illusion is immediately broken as soon as the player looks at it directly. For example, in a shooter game when scanning the battlefield for targets, or in a driving game when checking for alternative routes.
When scanning the battlefield most targets are so far away as to make moblur a non-issue. The eye is mostly looking for movement anyhow and a large splotch of moblur would make a target easier to spot than the same target in discrete pixels.

Until that happens, however, every time I look directly at something and what I'm looking at is a blurred mess due to motion blur or DOF, I will always hate it, because it not only reminds me of when I had bad vision, but it also makes my eyes hurt as they strain to bring it into focus but are unable to.

I know people in RL who suffer headaches from trying to bring a motion-blurred or DOF-blurred object into focus when they look directly at the object. Their brain can't handle the fact that the object cannot be brought into focus no matter how hard it (and the eyes) tries, and it continues to try no matter what.
I can agree with that, which is why it should be a setting. Design the game with blur effects for performance reasons, then have the option to disable them. Not sure how easy that is to implement in unified, monolithic postFX shaders though.
 