To be perfectly realistic, a game would need to track the position of the player's eyes to properly apply motion blur and DOF where needed. Then again, video game cameras nowadays don't try to emulate eyes but cameras, so the way they handle these effects is perfectly appropriate for a camera.
And that, IMO, is why motion blur is a failure in games.
In a movie, I'm watching something and have no input. Hence, I'm generally looking where the director wants me to look. When I don't do that, the whole illusion breaks down because my eyes are unable to focus on something they would be able to focus on in real life.
In real life, if I track a fast-moving object (say, a car moving at 150 mph), that object will be perfectly in focus while the surroundings are blurred. If I then focus on another object moving at a different speed, that object is now in perfect focus while the original car is blurred relative to what I am tracking.
In a game without motion blur, there will be some natural blur if I track that object with my eyes without moving the in-game camera. If I move the camera, everything goes stuttery/blurry depending on how often the image is refreshed in game, but whatever I'm focused on still remains relatively sharp. The lower the frame rate (30 fps, for example), the more stuttery/blurry/objectionable the effect. And at any point I can stop the camera and focus on a moving object, and it will be in relatively sharp focus, minus the stuttering due to the low frame rate.
With motion blur in a game I have the absolute worst of all worlds.
"I" am the director in a game. "I" get to choose what should be focused upon, exception being cinematics or heavily HEAVILY scripted scenes. However, "I" don't get to choose what gets motion blur and what doesn't if the game implements it. And that, IMO, is a huge
HUGE problem. I go to track a fast moving object where the game developer has chosen to apply motion blur? Tough luck, it's blurry even though I'm focused on it.
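And the engine really can't know better. Typical per-pixel motion blur is a post-process driven by a screen-space velocity buffer: every pixel gets smeared along its motion vector relative to the camera, and nothing in the inputs describes where my eyes are pointed. Here's a minimal sketch in C of that kind of pass, purely illustrative (the buffer layout and sample count are made up, not taken from any real engine):

```c
/* Illustrative post-process motion blur: average samples along each
 * pixel's screen-space velocity vector. Note the inputs: color,
 * per-pixel velocity, resolution. Gaze never enters into it, so an
 * object the player is smoothly tracking with their eyes gets
 * exactly as much smear as one they're ignoring.
 */
typedef struct { float r, g, b; } Color;

enum { NUM_SAMPLES = 8 }; /* arbitrary tuning value */

Color motion_blur_pixel(const Color *color_buf,  /* rendered frame */
                        const float *vel_x,      /* velocity, pixels/frame */
                        const float *vel_y,
                        int width, int height, int x, int y)
{
    float vx = vel_x[y * width + x];
    float vy = vel_y[y * width + x];
    Color acc = {0, 0, 0};

    for (int i = 0; i < NUM_SAMPLES; ++i) {
        /* step backwards along the velocity vector */
        float t = (float)i / (NUM_SAMPLES - 1);
        int sx = x - (int)(vx * t);
        int sy = y - (int)(vy * t);
        /* clamp samples to the frame */
        if (sx < 0) sx = 0; else if (sx >= width)  sx = width - 1;
        if (sy < 0) sy = 0; else if (sy >= height) sy = height - 1;
        Color c = color_buf[sy * width + sx];
        acc.r += c.r; acc.g += c.g; acc.b += c.b;
    }
    acc.r /= NUM_SAMPLES; acc.g /= NUM_SAMPLES; acc.b /= NUM_SAMPLES;
    return acc;
}
```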
Hence, I will not play any game where I cannot turn off motion blur in-game. Which also means I won't play any game on console that features motion blur if I can help it. Thank goodness for PC.
I don't play games to have my eyes tortured by a blurry mess when I focus on something. I had to live with that, without glasses, before I had laser eye surgery. I'm not keen on reliving the days when I couldn't focus on something I was looking directly at.
That said, I always leave motion blur enabled for cutscenes when the option is there. As then I'm basically watching a short film anyway.
Motion blur during gameplay, however? EPIC fail. Far worse than something like bloom or chromatic aberration that simulates a camera lens. At least with those the game still respects the fact that "I" am the director while playing. Even though I'm basically looking through a camera, "I" am still in control and "I" get to determine what my eyes are focused on.
Now, if technology eventually comes to gaming where the game can track where I'm looking and intelligently apply motion blur? I'd probably be OK with that.
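A plausible shape for that, and I stress this is purely hypothetical: feed the gaze point from an eye tracker into the blur pass and fade the blur out near wherever I'm actually looking. The gaze coordinates and falloff radius below are made-up parameters, not any real eye-tracking API:

```c
#include <math.h>

/* Hypothetical gaze-aware attenuation: scale a pixel's blur velocity
 * down as it approaches the player's gaze point, so whatever the eye
 * is tracking stays sharp. The gaze coordinates would come from an
 * eye tracker; the falloff radius is an arbitrary tuning value.
 */
void attenuate_by_gaze(float *vx, float *vy,
                       float px, float py,         /* pixel position */
                       float gaze_x, float gaze_y, /* from eye tracker */
                       float radius)               /* falloff, pixels */
{
    float dx = px - gaze_x, dy = py - gaze_y;
    float dist = sqrtf(dx * dx + dy * dy);
    /* 0 at the gaze point, ramping up to 1 at `radius` and beyond */
    float scale = dist >= radius ? 1.0f : dist / radius;
    *vx *= scale;
    *vy *= scale;
}
```

With something like that, the object I'm tracking stays sharp the way it would in real life, while the periphery still gets its cinematic smear.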
Yes, I absolutely hate motion blur during gameplay with a passion.
Regards,
SB