It is predictive (it uses the previous frame + motion vectors + the OFA, the optical flow accelerator) and it is artifact city. There are just enough people (here too) who completely avoid that particular side of the topic, so it doesn't get much attention.

Yeah, the more I think about it, the less sense it makes. Why would I want a NN to generate subpar frames when I could just have my monitor or TV do a much better job for the same latency cost and zero performance hit?
Edit: The only way it could make sense to me is if it is predictive, but that seems like a recipe for artifact city? Maybe we humans just won't/don't notice the glitches in the matrix?
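For anyone unclear on what "previous frame + motion vectors" means in practice, here's a minimal toy sketch of the idea in NumPy (my own illustration under simplified assumptions, not NVIDIA's pipeline): forward-warp the last rendered frame along per-pixel motion vectors to guess the next one. The holes and collisions the warp leaves behind are exactly where the artifacts come from.

```python
import numpy as np

def extrapolate_frame(prev_frame: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Forward-warp prev_frame (H, W, 3) by per-pixel motion (H, W, 2), in pixels/frame."""
    h, w, _ = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Destination coordinates after pushing each pixel along its motion vector.
    dst_x = np.clip(np.rint(xs + motion[..., 0]).astype(int), 0, w - 1)
    dst_y = np.clip(np.rint(ys + motion[..., 1]).astype(int), 0, h - 1)
    out = np.zeros_like(prev_frame)
    # Collisions: last write wins. Disocclusions: stay empty. Both show up as artifacts.
    out[dst_y, dst_x] = prev_frame[ys, xs]
    return out
```

Real implementations obviously do far more (depth-aware warping, hole filling, blending with the next frame), but the failure modes are of this kind.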
Yeah, I was wrong about DLSS and where it went with 2.0 (it is great). There's still a bit of softness, but that is a good (correct) tradeoff to make, if you have to make one.
But NNs improving is one thing; time travel is another. You can also already get a far superior result with very little latency (at a decent framerate) from solutions that already exist.
With the 'Game Motion Plus' settings at max, there's 28.2ms of input lag, which is higher than with the setting disabled [me: 10ms without it], but it's still good for casual gamers.
You will need an additional frame to interpolate, and unless you invent time travel, that will take time. Nobody has suggested defying the laws of physics afaik.
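To put rough numbers on "that will take time" (my own back-of-envelope, assuming the interpolator has to buffer one future frame before it can insert anything): the added latency is about one native frame time plus whatever the generation itself costs.

```python
# Back-of-envelope only: interpolation must wait for the *next* rendered frame
# before it can place a generated frame between the two, so the extra delay is
# roughly one native frame time plus the generation cost (assumed, not measured).
def added_latency_ms(native_fps: float, generation_cost_ms: float = 0.0) -> float:
    return 1000.0 / native_fps + generation_cost_ms

print(round(added_latency_ms(60), 1))   # ~16.7 ms extra on top of a 60 fps pipeline
print(round(added_latency_ms(120), 1))  # ~8.3 ms extra at 120 fps native
```

Which is in the same ballpark as the TV figure quoted above, and why the only way to dodge that cost entirely is to predict frames rather than interpolate them.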
"and it is artifact city"
Haven't checked FS myself, but Cyberpunk exhibits similar, even if less pronounced, artifacts, at least in parts of the DF preview (which was controlled by NV).

Spider-Man still has trouble doing proper DLSS2 to this day (their upscaled RT reflections require special handling when upscaling), so it's only natural that their early DLSS3 footage from the game has artifacts.
Comparatively, Cyberpunk and Flight Simulator made very solid showings for DLSS3.
So I would appreciate it if you learned a little bit from the past 4 years and waited before shunning a tech based on one early showing, especially when you know that NVIDIA is going to keep hammering away at it until it's close to perfection.