A comparison of motion blur implementations *spawn

Update: I was correct in calling out the TAA in Uncharted 4 as the culprit. Just out of curiosity, I've watched this demo from Epic again


Higher quality here: http://www.gamersyde.com/news_unreal_engine_open_world_video-16335_en.html

The AA used in that video exhibits most of the issues seen in the recent Uncharted 4 footage (although more exaggerated), pictures:
[screenshot comparisons]

Very similar issues although ND seem to have iterated on it and improved it a lot since E3 (as have Epic almost certainly). And the Kite video is running on a Titan X so hardware is certainly not the issue here :p

To be honest, I prefer the super clean look in motion that their technique provides! It's certainly an interesting topic: temporal stability over static image quality?

I was surprised by the clean image in the UC4 beta and stress test beta...

I don't play screenshots, so the choice is easy for me...
 
I like motion blur. That or a 240hz refresh without blur. (Even then, we'd need blur on helicopter blades, for example)
 
New R&C game on PS4 has very nice object blur, especially when collectibles start flying across the screen toward Ratchet.
 
I like motion blur. That or a 240hz refresh without blur. (Even then, we'd need blur on helicopter blades, for example)
rotors can keep a helicopter afloat at 300RPM, which is... 50RPsecond. 240hz should be enough to represent those without motion blur right?
(I always thought, as a child, that in The Matrix, there was an error with the slow rotor turning during the 'rescue morpheus' scene, but I looked it up later :cool: )
 
rotors can keep a helicopter afloat at 300RPM, which is... 50RPsecond. 240hz should be enough to represent those without motion blur right?
300/60=5
But, the actual visual pattern cycles at more like 20Hz if you're considering the symmetry and multiple rotors.
Or maybe more like 30Hz if we're talking about 450rpm helicopters.

But the answer is still "not really." 240 Hz should be easily good enough if your goal is, say, to use the sampled data to measure the frequency of the helicopter rotors. But the spatial gap between time-samples is still big enough that, if you blend a few consecutive samples, you'd get separated ghosts rather than a smooth blur. This would be especially obvious if the framerate were close to a multiple of the rotor frequency: you'd be looking at a group of static blade ghosts (sample a 4-rotor 7.5 Hz chopper with a 240 fps camera, and you'd see a pack of 32 unmoving blades).
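The blade-ghost arithmetic above can be checked with a toy calculation (a sketch only; `blade_positions` is a made-up helper, not tied to any real camera):

```python
# Sample a rotor spinning at rotor_hz with a camera running at fps, and
# count the distinct blade positions that appear when consecutive frames
# are blended (exploiting the rotational symmetry of an n-blade rotor).

def blade_positions(rotor_hz, n_blades, fps, n_frames):
    """Angular positions (degrees, mod blade symmetry) seen across frames."""
    symmetry = 360.0 / n_blades           # 4 blades -> pattern repeats every 90 deg
    positions = set()
    for frame in range(n_frames):
        angle = (360.0 * rotor_hz * frame / fps) % symmetry
        positions.add(round(angle, 6))    # round to merge float noise
    return sorted(positions)

pos = blade_positions(rotor_hz=7.5, n_blades=4, fps=240, n_frames=240)
print(len(pos))         # 8 distinct positions within one symmetry sector
print(len(pos) * 4)     # 32 static blade "ghosts" on screen, as described above
```

With a 7.5 Hz rotor, the angular step per 240 fps frame is 11.25 degrees, which divides the 90-degree symmetry sector exactly eight times, so the ghosts never move.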
 
I like motion blur. That or a 240hz refresh without blur. (Even then, we'd need blur on helicopter blades, for example)
Yes, I know some people don't want motion blur no matter what, but the thing is there are a lot of things, e.g. plane/helicopter propellers, a thrown ball etc., that you actually see blurred in REAL LIFE with your eyes.
In fact, next time you're on a propeller plane, take your smartphone out, turn on the camera and point it at the propellers, and you will see the propeller bent 180 degrees around.
 
Yes, I know some people don't want motion blur no matter what, but the thing is there are a lot of things, e.g. plane/helicopter propellers, a thrown ball etc., that you actually see blurred in REAL LIFE with your eyes.
Yes, and your eyes would create it naturally if looking at video played back at sufficiently high framerate.
 
Yes, and your eyes would create it naturally if looking at video played back at sufficiently high framerate.
I want to add: the advantage of a very high refresh rate screen would be that you could be tracking the fast object with your eyes, and then all the rest would be blurry but the object would be sharp, like in real life :)
 
There are infinite ways to legitimize motion blur as it is unfortunately implemented (too heavily) in most video games: it's smooth, it removes the judder, it's nice to look at, it's like in the movies, it's filmic. "I like motion blur."

There are some rare cases where motion blur reproduces, in a way, what happens in real life: when you really can't follow the object with your eyes, as with helicopter blades. And it can help with tracking a fast-moving tennis ball; even if not realistic, it's helpful. But those cases are in fact quite rare.

But an object or background that is still (too much) blurred even when you can easily track it with your eyes is, and always will be, unnatural to our brain.
 
I had a Trinitron CRT running at 120 Hz and could easily notice the difference above 85 Hz (there used to be a demo of a rotating textured block, without motion blur, that showed the effect of higher refresh/frame rates when they were locked together).

At 120 hz motion looked almost natural, even without motion blur. The performance penalty for real games and apps would, of course, have been high.

But 120 hz on CRT was, as my BMX stunt riding friend would say, "sick as fuck".
 
I can see that The Order probably has the best motion blur of any video game ever produced or in production. But what is generally regarded as the second best game?
 
I wonder if you could make the case for a, say, 500 hz display and only update the frame buffer sparingly depending on how fast a given particle / segment of geometry was moving...

So: slow-moving background, 50 Hz with limited motion blur; fast-moving helicopter, 250 Hz; rotor blades, 500 Hz. That kind of thing. If necessary, render separately based on update rate and then composite. Then you use motion blur where appropriate, or render at higher rates where it works best, or use a combination of the two.
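As a toy sketch of that multi-rate idea (hypothetical layer names and rates, not any real engine's API), one could drive a 500 Hz compositor where each layer re-renders only at its own rate and the compositor reuses the cached layer image otherwise:

```python
# A 500 Hz display is driven every tick, but each layer only re-renders
# when a new period of its own update rate has started.

DISPLAY_HZ = 500

layers = {                      # layer name -> its own update rate (Hz)
    "background": 50,
    "helicopter_body": 250,
    "rotor_blades": 500,
}

def frames_rendered(duration_s=1.0):
    """Count how many times each layer is re-rendered over the duration."""
    counts = {name: 0 for name in layers}
    last_period = {name: -1 for name in layers}
    total_ticks = int(DISPLAY_HZ * duration_s)
    for tick in range(total_ticks):
        for name, rate in layers.items():
            period_ticks = DISPLAY_HZ // rate     # display ticks per layer frame
            if tick // period_ticks != last_period[name]:
                last_period[name] = tick // period_ticks
                counts[name] += 1
        # a compositor would blend the cached layer images here, every tick
    return counts

print(frames_rendered())   # {'background': 50, 'helicopter_body': 250, 'rotor_blades': 500}
```

The background ends up rendered only 50 times per second while the blades get all 500 updates, which is the bandwidth saving the post is gesturing at.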
 
Not entirely convinced rendering techniques for 500+ Hz displays really belong in the console forum, where the target display is gonna be 60 Hz for a long time to come. ;) It's a discussion with some merit in an algorithms or display thread, although is anyone making any noise about super fast refresh? Is it a tech even in the distant theoretical roadmaps of IHVs?
 
As far as I know, games are trying to emulate camera motion blur (at least until foveated rendering and eye tracking are used properly in games). Some examples from Interstellar (both 35mm film "normal" camera and IMAX 70mm film camera):

35mm: [screenshots]

IMAX: [screenshots]


And these are high-end optics that produce the most natural-looking result (non-digital).

Games at 30 fps without motion blur look very jarring; I recently tried locking Syndicate on PC to 30 with adaptive vsync and it was unbearable. You can make a case for 60 fps games, but I still think motion blur adds to the whole experience. Maybe the question should be: how often does motion blur in games, being an approximation and not a brute-force approach, show signs of error, and how often is that noticeable in motion?
 
I think zed and HTupolev are right actually.

The example HT gives is correct: when, due to aliasing, the rotor appears steady, increasing the framerate by 1 makes an obvious and recognizable difference, as the formerly steady-looking rotor starts to move. But this is IMO an extreme scenario.

When you have a framerate that nearly resolves the movement, upping it by 1 doesn't make a huge difference in the aliasing, and I don't think the change could be recognized as easily.
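That wagon-wheel intuition can be put into numbers with standard frequency folding (a sketch; `apparent_hz` is my own helper, not from any engine):

```python
# The aliased (apparent) cycle rate of a periodic pattern sampled at a given
# framerate. Near a "lock" (pattern frequency close to a multiple of the fps),
# one extra frame per second changes the apparent motion a lot; far from a
# lock, it barely matters.

def apparent_hz(pattern_hz, fps):
    """Aliased frequency seen when sampling pattern_hz at fps (Nyquist folding)."""
    folded = pattern_hz % fps
    return min(folded, fps - folded)

# Near lock: a 240 Hz blade pattern looks frozen at 240 fps...
print(apparent_hz(240, 240))   # 0 -> appears static
print(apparent_hz(240, 241))   # 1 -> now visibly creeping

# Far from lock: one extra fps changes almost nothing.
print(apparent_hz(130, 240))   # 110
print(apparent_hz(130, 241))   # 111
```

So both posts are consistent: the +1 fps change is dramatic exactly at the aliasing lock and barely perceptible elsewhere.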

I still wonder how temporal reprojection works and what its limitations are, e.g. regarding temporal aliasing issues.
 
I mean: what is temporal reprojection? Just using a few old frames and interpolating new ones? It can't be that primitive, or it would not be called reprojection.

So what is temporal reprojection (in a mathematical sense)?
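For what it's worth, here is a minimal sketch of what temporal reprojection usually means in TAA-style techniques (my reading of the common approach, not any specific engine's implementation): each pixel's motion vector tells you where that surface point was in the previous frame; you fetch the history color from there and blend it with the new sample.

```python
import numpy as np

def reproject_blend(curr, prev, motion, alpha=0.1):
    """curr, prev: HxW arrays of pixel values; motion: HxWx2 integer
    per-pixel offsets from the previous frame to the current one.
    alpha is the weight given to the new sample."""
    h, w = curr.shape[:2]
    out = curr.copy()
    for y in range(h):
        for x in range(w):
            # Where was this pixel's surface point last frame?
            py, px = y - motion[y, x, 0], x - motion[y, x, 1]
            if 0 <= py < h and 0 <= px < w:
                # Exponential history blend: mostly history, a bit of new sample.
                out[y, x] = alpha * curr[y, x] + (1 - alpha) * prev[py, px]
            # If the history falls off-screen (disocclusion), keep the
            # current sample unblended.
    return out

# Static scene: the new white frame is blended with the black history.
curr = np.full((4, 4), 1.0)
prev = np.full((4, 4), 0.0)
motion = np.zeros((4, 4, 2), dtype=int)
print(reproject_blend(curr, prev, motion)[0, 0])   # 0.1
```

Real implementations typically also clamp the history color to the neighborhood of the current sample to reject stale history; when that rejection fails, you get the ghosting and smearing artifacts discussed earlier in the thread.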
 