It's not about moving versus static shots, but about the capture system introducing a tiny degree of blur, enough to destroy the benefits of 4K. That is, a pin-sharp pixel at 1080p could become four blurred pixels at 4K, such that the picture captured natively and the one upscaled won't look much different. We all agree there are upper limits on resolution above which there's no benefit, whether that's 10K pictures or 2K, 44.1 kHz audio or 192 kHz, 60 fps or 120 fps or 2000 fps; at some point the added cost of supporting nigh-imperceptible gains means there's a ceiling there's no point in chasing. That limit is also going to be a matter of compromise over what can realistically be achieved with current tech, so even if 192 kHz audio is better than 48 kHz, we won't be able to use that per channel for a good while yet, and investment in chasing higher quality should be directed at more important issues like video datarates and lighter compression.
In the case of 4K, I question whether most cameras in use have the resolving power (with motion, whether large-scale or tiny hand movements, plus depth-of-field blur) to produce a significant difference between an image captured at 4K and displayed on a 4K screen, and an image captured at 1080p and upscaled to that same screen. My guess is the real difference across the frame is only a few percent. It will be much better in certain areas, such as a static close-up (an eye shot, say) where the viewer's focus is, which is worth pursuing for those with a large enough FOV and will give a better impression of the improved quality too: with a small area of interest in high detail, the periphery, which isn't in perfect focus anyway, won't be noticed as lacking detail. And that periphery isn't the target of higher resolution anyway.
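To make that "few percent" guess testable, here's a rough sketch of the kind of comparison I mean (hypothetical, using a synthetic test pattern, a simple Gaussian blur as a stand-in for lens/motion blur, and 2x2 decimation plus bilinear upscaling as a stand-in for a 1080p capture; numpy and scipy assumed). It prints a crude number for how much of the 4K detail actually survives a given amount of capture blur.

```python
# Hypothetical sketch: does a small capture blur erase the 4K advantage?
# Gaussian blur stands in for lens/motion blur; decimation + bilinear
# upscale stands in for capturing at 1080p and scaling to a 4K screen.
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

rng = np.random.default_rng(0)

# Synthetic "scene" with fine detail at 4K resolution.
scene = rng.random((2160, 3840))

# Simulated 4K capture with ~1 pixel of blur from optics/motion/hand shake.
captured_4k = gaussian_filter(scene, sigma=1.0)

# Simulated 1080p capture of the same blurred scene, then naive upscale back.
captured_1080 = captured_4k[::2, ::2]          # 2x2 decimation
upscaled = zoom(captured_1080, 2, order=1)     # bilinear upscale to 4K

# Mean relative difference as a crude "how different do they look" number.
diff = np.abs(captured_4k - upscaled).mean() / captured_4k.mean()
print(f"mean relative difference: {diff:.1%}")
```

Raising the blur sigma should push that difference toward zero; it's only with blur well under a pixel that the native 4K capture should genuinely pull ahead.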
But the point is, for 4x the data you're getting a marginal increase in quality for a small niche of the population, whereas improved framerate would give a far, far more noticeable improvement in quality for only two or three times the data. It would improve not only temporal quality but also perceived detail, which is constructed over multiple samples. This is true in games too, even though perfect pixels give 4K its best advantage there. I'd be interested to see the result of gamers exposed to two different setups, a 4K game at 30 fps and the same game at the same FOV at 1080p60, and see which they'd prefer to play. Personally I'd take 1080p60 on the sets and viewing distances I'll experience. I can't envisage any situation where I have a screen large enough/close enough to benefit from 4K in games.
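For reference, the raw pixel-throughput arithmetic behind the "4x versus two or three times" comparison (a back-of-the-envelope sketch only; encoded bitrates won't scale linearly with either resolution or framerate, since inter-frame compression gets cheaper when frames are closer together):

```python
# Back-of-the-envelope raw pixel throughput for the options discussed.
modes = {
    "1080p30":  (1920, 1080, 30),
    "1080p60":  (1920, 1080, 60),
    "1080p120": (1920, 1080, 120),
    "4K30":     (3840, 2160, 30),
}
base = 1920 * 1080 * 30
for name, (w, h, fps) in modes.items():
    rate = w * h * fps
    print(f"{name:9s} {rate / 1e6:7.1f} Mpixels/s  ({rate / base:.0f}x 1080p30)")
```

By that crude measure 4K30 and 1080p120 both cost 4x the raw data of 1080p30, while 1080p60 costs 2x, which is the trade-off I'm getting at.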