Why do some people dislike motion interpolation for anime and real-life movies/series?

Yah, it's a weird thing with movies. They're not really realistic; they're surreal. So much of the lighting is carefully designed by adding lights, bouncing lights, removing lights. So much of it has to do with aperture, shutter speed, shutter angle and lenses. I think once you go past 24 fps, the whole formula for how those things relate to each other changes. I don't think cinematographers have figured it out yet because there are so few HFR movies. I imagine there could be a future where you get 60 fps for smooth motion but visuals that look as dramatic as a movie instead of a low-budget televised BBC period drama.
 
Also, 24 fps can feel natural in the sense that you supposedly don't dream in HFR either.
What would this even mean, really?

Human visual perception (or our dream-like construction of it) is never in "frames"; our visual systems and the related brain regions consume data in a continuous stream or flow rather than any discrete frames. There are certainly measurements of time-dimensioned acuity of vision (e.g. peripheral vision can sense changes at a far higher frequency than the center of our vision), but those are not measured in "frames" either.

Not that I'm picking on you specifically, but frame rate really should never be tied to accurate descriptions of human visual systems.
 
That's why you set the motion interpolation to only handle panning.

For a movie with native multiple frame rates, see Spider-Man: Into the Spider-Verse.
Characters are animated at a lower frame rate than the camera pans.

The worst offender for low-frame-rate camera pans is the Spy x Family second ending theme.
 
That weirdness will go away as you realise that the higher the frame rate, the more closely the content matches reality.
This, so much. People are generally bad at being objective about change, and depending on what kind of change an individual may have experienced in their life, they may be change averse or change receptive. A few are change indifferent. It's not that HFR is wrong, it's just that it's different.

If you live or work in a change-volatile environment, you will accept change much more easily than those who do not, because it's human nature to acclimatise to a particular norm, where change becomes jarring. But I don't think it's surprising that after years/decades of watching TV/movies at 24-30 fps, this triggers a sense of weirdness, particularly if you move back and forth between the frame rates of content.
 

It's tied inherently because it's tied to how well high frequencies are preserved. So if there's some degree of blurriness, we can assume that's not an HFR-like signal. And even if a movie is HFR, you won't process most of that information anyway, so it mostly acts to prevent the nuisance of panning.
Opposed to this, there's some evangelizing in this thread about how it's "so good" out of the box, when 60 isn't all that higher than 24, and the supposed 300 lines of motion resolution on the usual TV with soap'ed 120 Hz is still just 600 lines, "big deal".
 
That's on purpose, not any kind of fault with the movie.
Yes, it was the best movie to show that smoother (higher-frame-rate) pans are okay.

With motion interpolation, people can also achieve something like that (e.g. the large motion vector setting in SVP).
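For intuition, here's a toy sketch (mine, not SVP's actual algorithm — real interpolators do 2-D block matching per region, occlusion handling, and artifact masking) of what motion-compensated interpolation does on a 1-D "scanline", compared with the naive frame blending cheap de-judder modes can fall back to:

```python
# Toy illustration of motion-compensated interpolation on a 1-D "scanline".
# Helper names are made up for this sketch; this is just the core idea:
# estimate how far the content moved between two frames, then render a
# middle frame with the content shifted half that distance.

def estimate_shift(frame_a, frame_b):
    """Brute-force search for the integer shift that best maps frame_a onto
    frame_b (out-of-range samples are treated as black)."""
    width = len(frame_a)
    best_shift, best_err = 0, float("inf")
    for shift in range(-width + 1, width):
        err = 0
        for x in range(width):
            a_val = frame_a[x - shift] if 0 <= x - shift < width else 0
            err += (frame_b[x] - a_val) ** 2
        if err < best_err:
            best_shift, best_err = shift, err
    return best_shift

def interpolate_midframe(frame_a, frame_b):
    """Render a frame halfway in time by shifting frame_a half the motion."""
    half = estimate_shift(frame_a, frame_b) // 2
    width = len(frame_a)
    return [frame_a[x - half] if 0 <= x - half < width else 0
            for x in range(width)]

# A bright pixel pans from x=2 to x=6 between two consecutive frames.
a = [0, 0, 255, 0, 0, 0, 0, 0]
b = [0, 0, 0, 0, 0, 0, 255, 0]

mid = interpolate_midframe(a, b)
print(mid.index(255))  # prints 4: the object lands halfway along the pan

# Plain blending instead leaves a ghost at both the old and new positions.
naive = [(av + bv) // 2 for av, bv in zip(a, b)]
print(naive[2], naive[6])  # prints 127 127: a double image, not motion
```

The motion-compensated midpoint puts the object where it would actually have been mid-pan, while plain blending produces a double image — which is roughly the difference between interpolation that looks smooth and interpolation that looks like artifacts.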
 
Isn't the blurriness part of the design? So it can be 48 fps and still have blur (e.g. The Hobbit).

Like how games have motion blur toggles/adjustments, high-frame-rate movies can adjust blur by adjusting shutter speed, albeit not as freely as video games, since the higher the frame rate, the faster the shutter speed also needs to be.

But low-frame-rate movies can be made with a super fast shutter speed, resulting in zero blur.
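To make that shutter-speed relation concrete — taking the common 180-degree shutter convention as a reference point (an assumption here; each production picks its own shutter angle):

```python
# Per-frame exposure (motion blur) time from shutter angle and frame rate:
#   exposure = (shutter_angle / 360) / fps
# At a fixed shutter angle, doubling the frame rate halves the blur per frame,
# which is why HFR footage can't simply keep 24 fps-style motion blur.

def exposure_seconds(fps, shutter_angle=180):
    return (shutter_angle / 360.0) / fps

print(exposure_seconds(24))        # 24 fps at 180°: 1/48 s ≈ 0.0208
print(exposure_seconds(48))        # 48 fps at 180°: 1/96 s ≈ 0.0104
print(exposure_seconds(24, 360))   # 24 fps, wide-open 360° shutter: 1/24 s
```

A very narrow shutter angle at 24 fps gives the near-zero-blur look mentioned above, at the cost of stuttery motion — the same trade-off games expose with a motion blur slider.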
 
It's scene change rate vs. frame rate. Imagine panning a uniform-color background vs. a complex one. Blur is akin to the former (less complex), and therefore less stuttery; in that extreme case an ultra-low frame rate would suffice too. Also, conceptually, why should 48 Hz be less than twice as blurry? To make it stutter anyway?
 
Back to the original question:
As someone who really likes HFR movies (Gemini Man was so awesome in the cinema!), motion interpolation is always crap, but especially for animation. You are totally destroying the impact of the frames and key frames if you do that. The only thing it could be used for is to have less choppy background panning, but that is not worth destroying the animation for.
 

Oh wow, that got a good chuckle out of me. :D So more than doubling something "isn't all that higher". :p Good one. :D

Regards,
SB
 


Why not use interpolation for background panning and not the characters animations? You can do that with SVP.

Kinda like how Avatar: The Way of Water has a variable frame rate.

The difference is that Avatar did it natively, so there's zero potential for artifacts.
 

Yes, if the director wants to make it in a higher frame rate it is perfectly possible. But there is little point having a computer interpolate (and often getting it wrong) when your brain can do it just fine.
 
Unfortunately, I don't know how to make my brain handle choppy panning just fine. Motion interpolation instantly fixed that (best with SVP; okay most of the time with de-judder at 1 on the LG CX).

Video games with heavy blur like The Last of Us are fine, but most movies don't have that amount of blur, and panning is just a choppy mess.

Fortunately, most movies don't use panning shots most of the time (maybe directors already know the risk of choppy panning, so they deliberately use pans only strategically?).
 
This always fascinated me. So many humans have been conditioned over years or decades that 24 or 30FPS is the "right" framerate for television and movies, yet here we are on a 3D-graphics-centric internet forum where we aim for high visual fidelity at high framerates.

Why are high framerates looked down upon as the "soap opera effect" when in fact they should be preferable and more life-like? I know the technical origin of the definition of "soap opera effect" btw, before someone tries posting it up as a non sequitur ;)
Because in 3d games your mind understands it’s all fake.

In film your mind wants the illusion to be real. All the extra temporal information helps dispel that illusion.

Some people didn't mind the HFR of The Hobbit, but to me the fakeness within scenes stood out. They weren't rocks but rather something that looked like Styrofoam painted to look like rocks. You're in a hobbit hole lit with candles or oil lamps, so why is the dining room so bright? In reality, films are mostly shot on staged sets. The more detail you provide, the easier it is to perceive that the settings aren't really natural.


 
Well plenty of people go to watch live theatre - it can be enjoyable even though it is as fake as it can get.

But I completely agree with dobwal, and others bringing up these points.
 
Hmmm, what about a movie like Collateral? No sets or even lights like in studio-shot movies. Probably the best case for increased frame rate in the domain of movies.
 
Also, motion interpolation would really destroy the point of a lot of scenes in movies like Spider-Man: Into the Spider-Verse and Puss in Boots 2.
 
I saw Into the Spider-Verse with motion interpolation and it looks amazing, especially the panning scenes (it has lots and lots of camera pans and swivels!) and the chase scenes.

You can also limit the interpolation to only interpolate camera pans, if you want, with SVP.

Interstellar, on the other hand, is horrendous with the LG CX's "cinema clear" motion interpolation: the stars became a mess when the camera panned. It works great with SVP and the LG CX's de-judder at 1. SVP gives a way better result, though (smoother, and it stays clean of weird visuals).

EDIT:
IIRC I've read on RTINGS that LG's motion interpolation is not the best; the best is Sony's.

I don't have a Sony TV, but compared to SVP, LG's interpolation is worse.

EDIT:
I wonder if NVIDIA DLSS 3 motion interpolation is better than SVP, or whether video players (and SVP) are going to incorporate DLSS 3 motion interpolation.

EDIT:
lol, turns out SVP has added DLSS 3 motion interpolation
 