IMRs and temporal coherency

I kind of wonder whether motion blur effects are really even a priority in 3D rendering. I mean, honestly, as fast as you can move in UT and Q3 you're not moving fast enough that a blur would look anything other than ridiculous. Motion blurs only occur in very limited instances when something is moving fast and you are relatively close. If a jet plane passes directly overhead it will be a big blur (and probably blow your ear drums out), but if you see it from a mile away it doesn't blur at all because in your perspective it's not even moving that fast.

I think the truth is motion blur effects will end up having a bad tendency to be overused and really won't add much to the realism of games. The only exceptions are maybe racing games or possibly flight sims. But there are already ways to make a poor man's motion blur that it seems to me would suffice. I'm just not sure motion blur is ever going to get much attention, nor whether it is important enough to deserve it anyway.
 
Motion blur only really makes sense when you would otherwise have a framerate several times the monitor refresh rate - if motion blur slows down the framerate to below the monitor refresh rate, it just gets in the way, smearing everything for no good reason. Also, there is the problem of keeping a fast moving object from becoming N translucent objects following each other. Good motion blur may well take GPUs that are an order of magnitude more powerful than the ones available today.
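The "N translucent objects" failure mode is easy to see in an accumulation-style blur, where the shutter interval is sampled at a handful of instants and the renders averaged. A minimal Python sketch (the 1D "renderer" is a made-up toy, not any real API):

```python
import numpy as np

def accumulation_motion_blur(render, t0, t1, subframes):
    """Average several sub-frame renders over the shutter interval [t0, t1].

    With too few subframes, a fast object shows up as discrete ghost
    copies instead of a smooth smear -- the 'N translucent objects'
    problem described above.
    """
    times = np.linspace(t0, t1, subframes)
    return np.mean([render(t) for t in times], axis=0)

# Toy 'renderer': a single bright pixel moving along a 16-pixel scanline.
def render(t, width=16, speed=8.0):
    img = np.zeros(width)
    img[int(speed * t) % width] = 1.0
    return img

blurred = accumulation_motion_blur(render, 0.0, 1.0, subframes=4)
# 4 subframes leave 4 separate ghost pixels at intensity 1/4 each,
# not a continuous smear; smoothness costs proportionally more renders.
```

Getting the ghosts to merge into a convincing smear takes many more subframes, i.e. many more full renders per displayed frame - which is where the order-of-magnitude cost comes from.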
 
pcchen said:
However, it is clearly wrong. Some people believe that 48 kHz is enough for digital sampling of audio. However, there are still some people who can hear the difference between 48 kHz and 96 kHz even on the same system.
:rolleyes: No doubt those are the people who use Valve amps and 'molecularly aligned, super cables'.

Seriously though, if a poor quality filter was used to convert from digital to analog then you might be able to hear the difference.
 
Nagorak said:
I kind of wonder whether motion blur effects are really even a priority in 3D rendering. I mean, honestly, as fast as you can move in UT and Q3 you're not moving fast enough that a blur would look anything other than ridiculous. Motion blurs only occur in very limited instances when something is moving fast and you are relatively close. If a jet plane passes directly overhead it will be a big blur (and probably blow your ear drums out), but if you see it from a mile away it doesn't blur at all because in your perspective it's not even moving that fast.

Motion blur can most certainly improve the rendering quality of scenes by a huge amount, though not if it drops the framerate too much.

Motion blur would primarily be beneficial if a technique were used that was either better looking or faster than rendering a comparable number of discrete frames. So, one situation where it's a no-brainer is when you have framerates in excess of the monitor's refresh (i.e. 100+ fps). However, if methods can be developed to provide higher performance/image quality, then it may be better for other situations as well. Quick example: multisampling AA. With a GeForce4, you can usually turn on 2x AA for little to no performance hit. It would be nice if we could see some limited motion blur in the future with a similar performance hit.

Now, the big question: why is it beneficial? The answer is simple. We're very used to seeing motion blur in the real world. We're so used to it that we don't even notice that we're seeing it. And yet, it most certainly is there. That is, the human eye actually takes about 1/8th of a second to detect light (However, I imagine each cell in your eye isn't synchronized, meaning greater response time in reality, but it will still take 1/8th of a second for the entire image to register). That basically means that there is motion blur all around us every day. Our brains are just trained to filter it out.

In a similar way, applying motion blur to games will provide massive improvements in apparent performance and image quality. Just like movies look quite good at a mere 24 fps, games shouldn't need to go above about 40-60 fps with good motion blur (Quick note: higher fps are generally beneficial for improving input response time, not necessarily for improving image quality).
 
Two things,
That is, the human eye actually takes about 1/8th of a second to detect light
is incorrect for most folks' eyes; we are able to detect light in much finer increments, and
Just like movies look quite good at a mere 24 fps,
Is a flawed argument considering movies AREN'T simply 24 fps (each frame is shown multiple times), the conditions in the theatre add to the effect of "okay"ness (we have no other light or moving objects to reference, so our brain smooths it for us), and even still, movies are noticeably flickery and choppy.
 
Simon F said:
pcchen said:
However, it is clearly wrong. Some people believe that 48 kHz is enough for digital sampling of audio. However, there are still some people who can hear the difference between 48 kHz and 96 kHz even on the same system.
:rolleyes: No doubt those are the people who use Valve amps and 'molecularly aligned, super cables'.

Seriously though, if a poor quality filter was used to convert from digital to analog then you might be able to hear the difference.
He said 'same system', which would presumably mean that the same filter was used.

Actually, sampling of sound at N hertz tends to result in aliasing effects at sub-multiples of N, so-called 'beating', somewhat similar to what happens if you have two vibrating strings almost but not perfectly tuned to the same tone (the analogous graphics effect would be some sort of a moire pattern). For CD sound (44100 Hz), there is a slight, but noticeable one around N/4 (11025 Hz). It will probably take some fairly well trained ears to notice this kind of stuff below ~10 kHz, though.
 
Chalnoth said:
That is, the human eye actually takes about 1/8th of a second to detect light (However, I imagine each cell in your eye isn't synchronized, meaning greater response time in reality, but it will still take 1/8th of a second for the entire image to register). That basically means that there is motion blur all around us every day. Our brains are just trained to filter it out.
But does the "latency" necessarily have something to do with the "bandwidth" of the eye? I'm not necessarily disagreeing with your conclusion, I just find the argument a bit lacking.

Chalnoth said:
games shouldn't need to go above about 40-60 fps with good motion blur
If your fps and CRT refresh rate aren't 1:1 you will get some choppiness due to interference between the fps and refresh frequencies.
 
I disagree that motion blur will add almost anything to a game. I don't know...I guess you guys are welcome to have your own opinions, but the truth is motion blur isn't just at the bottom of my personal wish list, it's not even on it at all. ;)

Of course I also think FSAA is more or less unneeded crap, so it's probably just me. 8)
 
arjan de lumens said:
Simon F said:
pcchen said:
However, it is clearly wrong. Some people believe that 48 kHz is enough for digital sampling of audio. However, there are still some people who can hear the difference between 48 kHz and 96 kHz even on the same system.
:rolleyes: No doubt those are the people who use Valve amps and 'molecularly aligned, super cables'.

Seriously though, if a poor quality filter was used to convert from digital to analog then you might be able to hear the difference.
He said 'same system', which would presumably mean that the same filter was used.
I didn't understand if you were agreeing or disagreeing with me.
My assumptions are that
  • the original signal was bandwidth limited, say <= 20 kHz.
  • the reconstruction is done 'correctly' (or near enough to it)
If either of these conditions has not been met then moving to the higher sampling rate will probably help hide the deficiencies.

Actually, sampling of sound at N hertz tends to result in aliasing effects at sub-multiples of N, so-called 'beating', somewhat similar to what happens if you have two vibrating strings almost but not perfectly tuned to the same tone
Errr... if that's happening then surely it's because your analog sound contains signals that are >= N/2 Hz, i.e. you've got aliasing. You must correctly lowpass filter your original sound.

For CD sound (44100 Hz), there is a slight, but noticeable one around N/4 (11025 Hz). It will probably take some fairly well trained ears to notice this kind of stuff below ~10 kHz, though.
If that were aliasing, then noise at 11 kHz would (most likely) correspond to an illegal frequency of 22+11 = 33 kHz. That seems unlikely since it'd be pretty easy to eliminate those frequency ranges.

More likely it'd be due to shoddy reconstruction. To do it properly, you should sum up weighted sinc functions, but I doubt anyone does that (even for a truncated/windowed sinc). If you don't use a sinc then, IIRC, the highest representable frequency is effectively much lower than 1/2 the sampling rate.

I disagree that motion blur will add almost anything to a game. I don't know...I guess you guys are welcome to have your own opinions, but the truth is motion blur isn't just at the bottom of my personal wish list, it's not even on it at all. ;)

Of course I also think FSAA is more or less unneeded crap, so it's probably just me. 8)
These arguments reappear time and again on B3D. Temporal AA would add to the 'reality' of the game, but it is expensive to do it properly.

Since you think AA in general is a waste of time, I'm probably wasting my time here ;) but, nevertheless, here goes...

Some TV sports events are videoed using "high-speed" video cameras which, rather than averaging over the entire field/frame period, capture a frame very quickly. For the sake of the argument, let's say it's 0.001 seconds. These video pictures make wonderful freeze frame stills, but when you play them as normal video, the animation looks disturbing and not smooth.

These cameras are analogous to the way computer games graphics are generated. Each frame represents an instant in time rather than a finite period (as you get with normal video). If you could sample through the entire frame period, then the game would look far more convincing.
 
Simon F said:
I didn't understand if you were agreeing or disagreeing with me.
My assumptions are that
  • the original signal was bandwidth limited, say <= 20 kHz.
  • the reconstruction is done 'correctly' (or near enough to it)
If either of these conditions has not been met then moving to the higher sampling rate will probably help hide the deficiencies.
I made pretty much the same assumptions myself - that the original signal was lowpass filtered at ~20 kHz or less and that reconstruction was linear, presumably using a box or triangle filter. Moving to a higher sample rate may reduce aliasing and 'beating' artifacts, but shouldn't have any other effect with the system being otherwise the same...?
Actually, sampling of sound at N hertz tends to result in aliasing effects at sub-multiples of N, so-called 'beating', somewhat similar to what happens if you have two vibrating strings almost but not perfectly tuned to the same tone
Errr... if that's happening then surely it's because your analog sound contains signals that are >= N/2 Hz, i.e. you've got aliasing. You must correctly lowpass filter your original sound.
Try firing up goldwave or some other program that lets you generate a pure synthetic waveform. With the sample rate being N Hz, generate a sine wave with a frequency of (N/2)-1 Hertz. If you haven't tried this before, the result might surprise you: what you get is effectively a wave at N/2 Hertz, with a time-varying amplitude following a sine wave at 1 Hz. That is 'beating'. A similar effect occurs near frequencies of N/X Hertz, where X is an integer, although its severity diminishes rapidly with increasing X.
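For anyone without a wave editor handy, the same experiment runs numerically. A small sketch, using a 1000 Hz sample rate as a stand-in for 44100 Hz so the numbers stay readable:

```python
import numpy as np

N = 1000                 # sample rate in Hz (toy stand-in for 44100)
f = N / 2 - 1            # pure tone 1 Hz below the Nyquist limit
n = np.arange(N)         # one second's worth of sample indices
samples = np.sin(2 * np.pi * f * n / N)

# Algebra: sin(2*pi*(N/2 - 1)*n/N) = -(-1)**n * sin(2*pi*n/N), so the
# samples alternate sign at N/2 Hz while their magnitude traces out a
# 1 Hz sine -- the 'beating' envelope described above.
envelope = np.abs(samples)
```

The stored sample values really do carry the 1 Hz amplitude wobble; whether you hear it depends entirely on how the DAC reconstructs the waveform between samples.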

For CD sound (44100 Hz), there is a slight, but noticeable one around N/4 (11025 Hz). It will probably take some fairly well trained ears to notice this kind of stuff below ~10 kHz, though.
If that were aliasing, then noise at 11 kHz would (most likely) correspond to an illegal frequency of 22+11 = 33 kHz. That seems unlikely since it'd be pretty easy to eliminate those frequency ranges.

More likely it'd be due to shoddy reconstruction. To do it properly, you should sum up weighted sinc functions, but I doubt anyone does that (even for a truncated/windowed sinc). If you don't use a sinc then, IIRC, the highest representable frequency is effectively much lower than 1/2 the sampling rate.
A sinc filter may be able to reconstruct the (N/2)-1 Hz frequency above, although the computation cost would be rather large. In practice, you're more likely to get a triangle filter or something like that (=shoddy, I presume), where the (N/2)-1 Hz frequency is, as indicated, non-representable, and beating at (N/4)+small value Hz is still noticeable and not quite going away with any reasonable-length filter.
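That gap is easy to quantify. A sketch comparing truncated-sinc reconstruction against linear interpolation (a triangle filter) on a near-Nyquist tone, with toy values for the sample rate:

```python
import numpy as np

def sinc_reconstruct(samples, t, rate):
    """Truncated ideal reconstruction: sum of sinc kernels at each sample."""
    n = np.arange(len(samples))
    return np.array([np.sum(samples * np.sinc(rate * ti - n)) for ti in t])

rate = 100                                # toy sample rate
f = rate / 2 - 1                          # near-Nyquist test tone
n = np.arange(2 * rate)                   # two seconds of samples
samples = np.sin(2 * np.pi * f * n / rate)

t = np.linspace(0.8, 1.2, 401)            # evaluate away from window edges
ideal = np.sin(2 * np.pi * f * t)
linear = np.interp(t, n / rate, samples)  # triangle-filter reconstruction
sincr = sinc_reconstruct(samples, t, rate)

err_linear = np.max(np.abs(linear - ideal))
err_sinc = np.max(np.abs(sincr - ideal))
# Linear interpolation collapses toward the beat envelope (error near
# full amplitude between samples), while the sinc sum tracks the
# near-Nyquist tone closely.
```

At the sample instants both methods are exact; the triangle filter falls apart between them, which is exactly where the audible beating comes from.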
 
arjan de lumens said:
Try firing up goldwave or some other program that lets you generate a pure synthetic waveform. With the sample rate being N Hz, generate a sine wave with a frequency of (N/2)-1 Hertz. If you haven't tried this before, the result might surprise you: what you get is effectively a wave at N/2 Hertz, with a time-varying amplitude following a sine wave at 1 Hz. That is 'beating'. A similar effect occurs near frequencies of N/X Hertz, where X is an integer, although its severity diminishes rapidly with increasing X.
No, that doesn't surprise me in the least. You've played it back with 'less than ideal' hardware. The example you gave is right on the theoretical limit and so you MUST reconstruct the signal using the sinc function. A triangle certainly does not 'cut the mustard'.
 
Simon F said:
No, that doesn't surprise me in the least. You've played it back with 'less than ideal' hardware. The example you gave is right on the theoretical limit and so you MUST reconstruct the signal using the sinc function. A triangle certainly does not 'cut the mustard'.
Exactly. And I claim that similar, if weaker, filtering artifacts also appear at lower sub-multiples of the sampling rate with non-sinc filters, making it possible to actually hear the difference between a perfect sinc filter and a simple triangle filter on CD sound with just the right frequencies - the difference between 48 and 96 kHz sound corresponds roughly to this difference on most actual audio hardware.
 
tobbe said:
But does the "latency" necessarily have something to do with the "bandwidth" of the eye? I'm not necessarily disagreeing with your conclusion, I just find the argument a bit lacking.

Not exactly sure what you mean. I don't think there's any limitation in bandwidth at all. It's just that it takes time for different cells to register how much light is hitting them.

If your fps/CRT refreshrate aren't 1-1 you will get some choppiness due to interference between the fps and refresh frequencies.

Well, it may be possible for motion blur techniques to be used to keep the framerate within a specified range. For example, the hardware could automatically adjust the degree of motion blur used each frame dependent upon certain factors (easiest one would be previous frame's draw time), keeping the framerate from varying too much. The higher the framerate, the more effective a technique like this would be.
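Such a feedback loop could be sketched roughly as follows; the function, parameter names, and 16.7 ms frame budget are all illustrative assumptions, not an existing driver interface:

```python
# Hypothetical feedback scheme: pick how many motion-blur subframes the
# next frame can afford, based on the measured draw time of the previous
# frame. All names and the 16.7 ms (60 Hz) budget are made up.
def choose_subframes(prev_draw_ms, prev_subframes, target_ms=16.7,
                     min_sub=1, max_sub=8):
    per_subframe_ms = prev_draw_ms / prev_subframes   # cost of one sub-render
    budget = int(target_ms // per_subframe_ms)        # how many fit in budget
    return max(min_sub, min(max_sub, budget))

# If 4 subframes took 25 ms, each cost ~6.25 ms, so only 2 fit in 16.7 ms:
# blur quality degrades before the framerate does.
```

The appeal of a scheme like this is that it trades blur quality, which varies gracefully, for framerate, which stutters visibly when it varies.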

I disagree that motion blur will add almost anything to a game. I don't know...I guess you guys are welcome to have your own opinions, but the truth is motion blur isn't just at the bottom of my personal wish list, it's not even on it at all.

Of course I also think FSAA is more or less unneeded crap, so it's probably just me.

I just have to say, Nagorak, that you'll discover how good it looks when it's finally implemented. Motion blur is coming. It might be a few years off yet before it's used for anything but special effects, but it is coming.
 