Graphical effects that are standard by now but shouldn't be

How motion blur really feels depends a lot on framerate and blur length.
Movies are usually 24 fps with half-length blur (shutter open for 1/48 s).
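To make that arithmetic concrete, here's a minimal Python sketch of the fps/shutter-angle relationship; the function name and the 180-degree default are illustrative, not anything standard:

```python
# Exposure time per frame from frame rate and shutter angle.
# A 180-degree shutter is the classic "half-length" film blur.
def exposure_time(fps: float, shutter_angle_deg: float = 180.0) -> float:
    """Seconds the shutter stays open per frame."""
    return (shutter_angle_deg / 360.0) / fps

# 24 fps film with a 180-degree shutter exposes each frame for 1/48 s.
print(exposure_time(24))         # 1/48 ~ 0.0208 s
# A fully open (360-degree) shutter at 60 fps exposes for 1/60 s.
print(exposure_time(60, 360.0))  # 1/60 ~ 0.0167 s
```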

If a game has a low framerate and an open shutter, the image can feel very blurry even with the smallest movement.
In a 60 fps game a fully open shutter doesn't really feel bad: the image is crisp enough, and it gives a nice feel to the movement.
 
Have you actually played TLoU on PS3? The motion blur looks iffy from any "cinematicness" perspective, but is IMO one of the most successful implementations I've seen for making a poorly-performing game feel smoother.

I'd actually agree with your gist for a lot of implementations, but you've chosen the worst example possible.

It's maybe smoother, but it's no longer a game that deserves to be played on a PS3; it's more like a Game Boy Advance game upscaled to 720p (except for the main character). Everything (details, textures, geometry, etc.) is destroyed, and here the camera is only slowly moving around Joel; it can be much worse.

A smooth and judder-free Game Boy Advance game, then.

Any motion blur brings more bad than good to the image, IMO. Some people will prefer the good, but in reality many more players hate it. CA brings some good to the image too: some people think it reduces aliasing (or whatever), or makes the experience more cinematic or artsy, adding "depth" in a good way. The same can be said of DOF on background scenery, which can hide low-resolution assets or make the image more artsy. I am talking gameplay only; obviously in cutscenes DOF can serve other useful functions, like focusing attention on one character. But what interests me here is interactive gameplay scenes only.

Motion blur is in fact no different from chromatic aberration or depth of field (which is why devs really use it): it's an ugly cinematic effect (a product of old, dated technology; those three effects could now be almost totally eradicated from movies if directors really wanted it, but obviously they don't, for nostalgia, commercial/marketing, and cost reasons) that can fortunately add some good things to the image. People aren't complaining much only because they're used to the "ugly" part from watching movies and think this ugliness is normal and acceptable.

They think it's the price to pay if they want to play a 2015 video game: that devs have no choice, so gamers have to accept it. I think people complain more about CA because they aren't used to strong CA in movies. But if CA at level 10 (as seen in Bloodborne) were, for whatever reason, the movie industry standard, I am sure they'd complain a lot less and unconsciously, gladly accept it.
 
I can't stand motion blur. It gives me headaches.
But motion blur only for certain things is fine, like when taking damage, or when being thrown away at super high speed.

The worst offender is FF Type-0's motion blur. The headache made me unable to get past the first robot, and I just uninstalled it.
 
There’s no right or wrong answer to the implementation of chromatic aberration, motion blur, or any other post-processing effect. I’m definitely not an advocate of a perfectly artificial image with no post-processing, unless the game demands that level of image clarity, something like an online shooter. I actually quite like the film-like presentation of many modern games; it distracts ever-so-slightly from the feeling that I’m playing a game. Bloodborne’s CA has never once provoked a negative reaction from me; if anything, I’d go as far as to say that I quite like it, and that it fits well with the dank and dark presentation of the environments (which are incredible).

As Laa-Yosh stated previously, image quality is an art. You wouldn’t criticise The Burning Monk (as seen on RATM’s self-titled album) for not having a fast enough shutter to fully capture the fire in non-blurred form, or say that the lack of colour distracts from the message provided in the image.

Image quality does tend to go one of two ways: 1. the impossibly perfect image that’s usually presented on any default Windows machine’s desktop, or the retouched images presented in magazines; 2. the perfectly imperfect imagery of art-style photographers, who can use slow shutters, narrow apertures, and grainy high-ISO film. It’s widely accepted that these ‘features’ of a manual camera actually have benefits. I’m all for letting the art department of a development team dictate what’s best for the imagery of their game.
 
It doesn't matter how many times it is said: until we have displays with infinite framerates, motion blur (that is, correct MB) is a necessity for a mathematically accurate representation of movement, just as AA is a necessity for a correct representation of shapes and forms on displays without infinite spatial resolution.
Sure, it's legitimate to debate whether the gross approximations of accurate MB that games use are better than just ignoring it, much like the debate over post-AA techniques that smudge out too much detail, but saying MB is always absolutely wrong is also incorrect. But they'll say it's wrong, and eye tracking, and yada yada yada.
 
Motion blur is not only a recreation of a cinematic effect. It's a recreation of human perception.
Only when it matches what the human is looking at. Motion blur on moving objects that the eye is tracking effectively is wrong, but an unavoidable compromise of our display tech. Overall I think the gains outweigh the losses when done correctly.
 
Surely the amount of motion blur present in the display affects how much should be present in the output?
Not really; those are very different effects.
Display blur is just an artifact of display technology as one color changes to another.
It doesn't add information to the scene; it's just an afterimage.

Proper motion blur encodes temporal information about the movement, rotation, or topological change of objects and the viewer.
 
It doesn't matter how many times it is said: until we have displays with infinite framerates, motion blur (that is, correct MB) is a necessity for a mathematically accurate representation of movement

Actually, I'm not sure that is completely true. There's a clear limit to how much change the human eye can perceive per second, and it will reconstruct everything in between. So at some point, my guess being somewhere close to 80 fps, graphics wouldn't benefit much from motion blur. But I don't know whether actual research has been done in this field.
 
It depends on angular motion. If you have an object pass from one side of the screen to the other in a fraction of a second, say a tennis ball across a camera's view, it'll look reasonable if the display is tiny and at a distance, whereas if the display is a wall-sized projection filling your FOV, you'll just see strobing at normal framerates. 30 fps looks very smooth when things move a few pixels per frame, but discontinuous at larger displacements. 60 fps looks rough for high-speed action.

So we may as well just blur between frames regardless of the frame interval. Even at 144 fps, movement that crosses a large FOV within a few frames will look like strobing without moblur. And an object that exists for a single frame would just be a flash, a subliminal message to play tennis, instead of the yellow-green smudge of a tennis ball at speed.
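The displacement argument above can be sketched with some rough arithmetic; the resolution and crossing time below are my own illustrative assumptions:

```python
# How far an object jumps per frame, in pixels, given the time it
# takes to cross the screen and the display's horizontal resolution.
def pixels_per_frame(screen_width_px: int, crossing_time_s: float,
                     fps: float) -> float:
    speed_px_per_s = screen_width_px / crossing_time_s
    return speed_px_per_s / fps

# A ball crossing a 3840-pixel-wide display in a quarter of a second:
print(pixels_per_frame(3840, 0.25, 30))   # 512 px jump per frame at 30 fps
print(pixels_per_frame(3840, 0.25, 144))  # still ~107 px per frame at 144 fps
```

Even at 144 fps the object lands over a hundred pixels away each frame, which is why large angular motion strobes without blur between frames.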
 
A fast enough object could potentially fly across the screen without ever appearing in any frame, even if that screen updated faster than a human could perceive. The screen-space motion blur used in most games would not solve that problem, as it needs something to be on screen to blur, but a theoretical "perfect" motion blur solution would.
 
Not really; those are very different effects.
Display blur is just an artifact of display technology as one color changes to another.
It doesn't add information to the scene; it's just an afterimage.

Proper motion blur encodes temporal information about the movement, rotation, or topological change of objects and the viewer.
Even when comparing a sample-and-hold display to an impulse driven one?
 
A fast enough object could potentially fly across the screen without ever appearing in any frame, even if that screen updated faster than a human could perceive. The screen-space motion blur used in most games would not solve that problem, as it needs something to be on screen to blur, but a theoretical "perfect" motion blur solution would.
Yep.

But your eyes would see it. As a smudge, but they'd see the smudge.
To illustrate, consider a refresh of 10 fps: you'd have a smudge across the screen for 1/10th of a second, very visible and, thanks to moblur, quite informative.
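A toy Python sketch of this exchange: temporal sampling across the shutter interval records a fast object that a single snapshot misses entirely. All names and numbers here are illustrative assumptions, not any engine's actual blur implementation:

```python
# Approximate a "perfect" temporal blur by sampling an object's 1D
# position many times within one shutter interval, instead of blurring
# a single rendered snapshot (as screen-space blur does).
def coverage(x0: float, speed: float, shutter: float, samples: int,
             screen: tuple = (0.0, 1.0)) -> float:
    """Fraction of sub-frame samples where the object is on screen."""
    hits = 0
    for i in range(samples):
        t = shutter * i / samples          # sub-frame sample time
        x = x0 + speed * t                 # object position at time t
        if screen[0] <= x <= screen[1]:
            hits += 1
    return hits / samples

# Object starts off-screen left and crosses the whole screen inside one
# 1/60 s frame. A single snapshot at t=0 sees nothing:
print(coverage(-0.5, 120.0, 1/60, samples=1))    # 0.0 -> invisible
# ...but temporal supersampling still records the streak:
print(coverage(-0.5, 120.0, 1/60, samples=256))  # ~0.5 -> visible smear
```

The single-sample case is the "never appears in any frame" failure; the supersampled case is the smudge your eyes (or an analytic blur) would still catch.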
 
All this talk about "necessary" or "useful" per-object moblur for fast-moving single objects is worthless when applied to full-screen motion blur. People don't hate per-object motion blur; they hate full-screen motion blur.
 