Post processing: Motion blur....why?

ShootMyMonkey - do the shaderized versions use anisotropic filtering to perform the blurring (use the framebuffer as the texture to filter, and magnitude/direction of the per-pixel velocity vector to determine anisotropy)?
 
A real-life (TM) example is the effect you get under strobe lighting: each flash freezes the motion perfectly sharp, and it doesn't look correct.
The best example I've seen is the diving events at the Olympics. They typically 'film' them using a 1/1000s or faster shutter, so the slow-motion replay isn't just one big motion blur. When I first saw this at the Barcelona games I spent two or three days trying to work out why it looked so odd, especially where the water splashes.

Good motion blur will be a good thing, but it's very, very tough. The best examples I've seen so far are in the demo videos for one of the next-gen F1 games, which uses a depth-peeling / directional blur filter algorithm.
 
If you want to see the benefits of motion blur, ask yourself why the dinosaurs in the original Jurassic Park look so much better than Harryhausen-style stop-frame animation. It's easy enough to see: just pause the film. A stop-frame model appears completely sharp in each frame. The Jurassic Park dinosaurs have realistic motion-blur, as if they had actually been moving in front of the camera while the shutter was open.

Another illustration is what happens in TV and film when they set the camera to a faster shutter speed. This effect is used (indeed, overused) in many shows and films to make scenes seem dramatic or fragmented, often over the top of battle scenes. It makes everything look too sharp and too jerky. The show Over There uses it a lot, and so does the BBC's new version of Robin Hood.

As already mentioned, the fact that there is motion-blur in films (caused by the camera's shutter usually remaining open for at least half the duration of the frame) is precisely what allows cinema film to give an impression of smooth motion even though the frame rate is only 24fps. Computer-generated images without motion-blur look really choppy at 24fps, and you need a minimum of 60 to get even close to feeling smooth. (There are, of course, other aspects to that, like controller-latency).
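The shutter argument above can be sketched as classic accumulation-style motion blur: render several sub-frame instants while the shutter is "open" (half the frame period, as with a film camera) and average them. A toy sketch — the one-pixel "renderer" and every numeric value here are invented purely for illustration:

```python
WIDTH = 16

def render(pos):
    """Toy 'renderer': a single bright pixel at position pos."""
    frame = [0.0] * WIDTH
    frame[int(pos) % WIDTH] = 1.0
    return frame

def blurred_frame(frame_index, fps=24.0, shutter=0.5, samples=8, speed=480.0):
    """Average several sub-frame renders across the shutter-open interval.

    shutter=0.5 mimics a film camera whose shutter stays open for half
    of each frame period (a 180-degree shutter).
    """
    open_time = shutter / fps            # seconds the shutter is open
    t0 = frame_index / fps               # frame start time
    acc = [0.0] * WIDTH
    for i in range(samples):
        t = t0 + open_time * i / samples
        sub = render(speed * t)          # object moves at `speed` px/s
        acc = [a + s / samples for a, s in zip(acc, sub)]
    return acc

snapshot = render(0.0)      # zero-length exposure: one perfectly sharp pixel
blurred = blurred_frame(0)  # the same energy smeared along the motion path
```

At 24fps the smear bridges the gap toward the object's position in the next frame, which is why filmed motion reads as continuous while equally sharp CG snapshots at the same rate look choppy.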
 
Movies can also look really choppy (when a camera is panning, for instance), and part of a movie director's job is to avoid situations that cause too much jerkiness.
Motion blur is a nice trick, but it isn't everything; you still need more frame rate to get more information.

At least for games, there's a setup I like: run at 100fps on a 100Hz screen. That's very smooth and very low-latency; 60fps gaming is a bit jerky in comparison.
 
Movies can also look really choppy (when a camera is panning, for instance),
That's (probably) mainly because, as nicolasb points out, the camera shutter is open for only half the frame period. A full-length exposure would improve things, but it'd be technically challenging to build such a camera.**


** One solution I thought of was to have two rolls of film, two shutters running out of phase, and a "splitter" after the lens. A post-process on the two films could then be used to merge them into a single sequence. <shrug>. Of course, it'd be cheaper to do with a digital camera :)
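The merge step of that two-roll idea is easy to sketch numerically: roll A's shutter covers the first half of each frame period, roll B's the second half, and averaging the two behaves like one full-length exposure. The moving-dot "scene" and all the numbers below are made up for illustration:

```python
WIDTH, SAMPLES = 32, 8
FPS = 24.0
SPEED = 32.0 * FPS   # px/s: the dot crosses the whole strip in one frame

def expose(t_start, t_end):
    """Integrate the moving dot over [t_start, t_end) -- one shutter opening."""
    acc = [0.0] * WIDTH
    for i in range(SAMPLES):
        t = t_start + (t_end - t_start) * i / SAMPLES
        acc[int(SPEED * t) % WIDTH] += 1.0 / SAMPLES
    return acc

period = 1.0 / FPS
roll_a = expose(0.0, period / 2)     # shutter A: first half of the frame
roll_b = expose(period / 2, period)  # shutter B: second half, out of phase
merged = [(a + b) / 2 for a, b in zip(roll_a, roll_b)]  # full-length exposure
```

The merged frame smears across the entire frame period — exactly what a single camera with a 100% shutter would have captured, without having to build one.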
 
At least for games, there's a setup I like: run at 100fps on a 100Hz screen. That's very smooth and very low-latency; 60fps gaming is a bit jerky in comparison.
I'd still like to see motion blur or higher FPS. The difference between 60fps and 100fps is astounding, but even at 100fps it doesn't feel realistic.
 
ShootMyMonkey - do the shaderized versions use anisotropic filtering to perform the blurring (use the framebuffer as the texture to filter, and magnitude/direction of the per-pixel velocity vector to determine anisotropy)?
There are more than a few methods: some use the framebuffer, and some apply the blur when rendering each model (which means the blur stays within the geometry bounds of the model itself -- it looks horrid in a screenshot, but at 60 fps it's difficult to notice). The closest thing I know of to explicitly affecting the anisotropy is the family of methods that subdivide the 3D velocity vector and use the projection matrix to re-project each offset point into 2D, so you get UV offsets which are perspective-correct rather than linearly spaced in 2D.
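That subdivide-and-reproject idea can be sketched as follows: offset the camera-space point along its 3D velocity, push each offset through the projection matrix, and use the resulting UVs as blur taps. The projection matrix and point/velocity values below are invented for illustration, not taken from any particular engine:

```python
import math

def project(p, proj):
    """Project a camera-space point to UV via a 4x4 projection matrix."""
    x, y, z = p
    clip = [sum(proj[r][c] * v for c, v in enumerate((x, y, z, 1.0)))
            for r in range(4)]
    ndc_x, ndc_y = clip[0] / clip[3], clip[1] / clip[3]   # perspective divide
    return (0.5 * ndc_x + 0.5, 0.5 * ndc_y + 0.5)         # NDC [-1,1] -> UV [0,1]

def velocity_blur_taps(p, v, proj, taps=4):
    """Subdivide the 3D velocity and re-project each offset point, giving
    perspective-correct UV sample positions along the motion path."""
    return [project([p[k] + v[k] * i / (taps - 1) for k in range(3)], proj)
            for i in range(taps)]

# A simple perspective matrix (45-degree FOV; values are illustrative).
f = 1.0 / math.tan(math.radians(45.0) / 2.0)
proj = [[f,   0.0,  0.0,  0.0],
        [0.0, f,    0.0,  0.0],
        [0.0, 0.0, -1.0, -0.2],
        [0.0, 0.0, -1.0,  0.0]]

p = [0.0, 0.0, -2.0]   # point in camera space
v = [0.5, 0.0, -2.0]   # 3D velocity over the exposure: right and away
taps = velocity_blur_taps(p, v, proj)
# The UV steps shrink as the point recedes -- unlike a linear 2D
# interpolation between the two endpoint projections.
```

Sampling the framebuffer at those UVs and averaging gives a blur whose taps bunch up where the motion foreshortens, which is the perspective-correct behaviour described above.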

I'd still like to see motion blur or higher FPS. The difference between 60fps and 100fps is astounding, but even at 100fps it doesn't feel realistic.
Well, it doesn't feel right even at 100 fps because what you get is still a series of snapshots... If you have BOTH motion blur and high fps, it's actually quite a bit better. The question of "realistic" reaches a bit broader than that, but steps are steps.
 