Nvidia DLSS 3 frame generation discussion

These very high-framerate interpolation scenarios are not very realistic. For VR in the short term, a strobed refresh is a more practical way to increase motion smoothness.
 
Yeah, the more I think about it, the less sense it makes. Why would I want an NN to generate subpar frames when I could just have my monitor or TV do a much better job for the same latency cost and zero performance hit?

Edit: The only way it could make sense to me is if it is predictive, but that seems like a recipe for artifact city. Maybe we humans just won't/don't notice the glitches in the matrix?
 
It is predictive (it uses the previous frame + motion vectors + the OFA, Nvidia's optical flow accelerator), and it is artifact city. There are just enough people (here too) who completely avoid that particular side of the topic, so it doesn't get much attention.
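For anyone unsure what "previous frame + motion vectors" looks like in practice, here's a minimal toy sketch (plain NumPy, purely illustrative, nothing to do with Nvidia's actual pipeline): warp the last rendered frame along per-pixel motion vectors to guess the next one. Anything the vectors don't describe (disocclusions, shadows, particles, UI) is exactly where the artifacts come from.

import numpy as np

def synthesize_frame(prev_frame, motion):
    """Toy motion-vector warp. prev_frame: (H, W, 3) image, motion: (H, W, 2) flow in pixels."""
    h, w, _ = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Backward warp: look up where each output pixel is predicted to have come from,
    # clamped to the image borders. Real implementations do far more than this.
    src_x = np.clip((xs - motion[..., 0]).round().astype(int), 0, w - 1)
    src_y = np.clip((ys - motion[..., 1]).round().astype(int), 0, h - 1)
    return prev_frame[src_y, src_x]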

Edit: here's my post from another thread for reference
 
Lol, well, unless Nvidia is also planning on inventing time travel... I'll pass.

Jeebus, that Spider-Man scene isn't even challenging and it screws up a static feature...

Lol at the shadow... Ray trace your lighting for accuracy, then destroy it with a trash gimmick that artificially inflates FPS numbers.
 
Well, DLSS 1 wasn't very good either, but it was a necessary step towards what we have now, which is quite good. I personally don't like to discount a promising technology based on its initial implementation. Many commercial technologies aren't very good at the start and need real-world consumer use to reach their goals.
 
Yeah, I was wrong about DLSS and where it went with 2.0 (it is great). Still a bit of softness, but that is a good (correct) tradeoff to make, if you have to make one.

But NNs improving is one thing; time travel is another. You can already get a much better result with very little latency (at a decent framerate) from solutions that already exist.
 

Which one? AFAIK latency-free motion interpolation doesn't exist.
 
Yes, one frame at 60 Hz is 16.7 ms, so that's a whopping ~1.5 ms of additional processing time on top of the one-frame buffer. The horror.
 
Not a frame, the time to display the signal: with MI (motion interpolation) enabled, the S95B needs 28.2 ms instead of 10 ms to display a frame from the source, which is nearly 3x longer.
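For what it's worth, those figures line up with the 1.5 ms post above; a quick back-of-the-envelope check (taking the S95B numbers quoted here at face value):

frame_time_60hz = 1000 / 60      # 16.7 ms per source frame at 60 Hz
lag_mi_off      = 10.0           # ms to display a frame, motion interpolation off
lag_mi_on       = 28.2           # ms to display a frame, motion interpolation on

added_latency   = lag_mi_on - lag_mi_off            # 18.2 ms
processing_cost = added_latency - frame_time_60hz   # ~1.5 ms beyond the one-frame buffer
print(added_latency, processing_cost)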
 
You will need an additional frame to interpolate towards, and unless you invent time travel, waiting for it takes time. Nobody has suggested defying the laws of physics, AFAIK.
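Put differently, the in-between frame can't exist until both of its neighbours do, so its content is already stale by the time it can possibly be shown. A tiny timing sketch, assuming a steady 60 fps source and ignoring processing time entirely:

frame_time = 1000 / 60        # 16.7 ms between source frames

t_n  = 0.0                    # frame N is ready
t_n1 = t_n + frame_time       # frame N+1 is ready 16.7 ms later

# The generated frame depicts the moment halfway between N and N+1,
# but the earliest it can be displayed is when N+1 exists.
content_time   = t_n + frame_time / 2
earliest_shown = t_n1
print(earliest_shown - content_time)   # ~8.3 ms of unavoidable staleness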
 

I've only seen that used in the actual card marketing materials, but not in these discussions.
 
and it is artifact city

Spider-Man still has trouble doing proper DLSS 2 to this day (its upscaled RT reflections require special handling), so it's only natural that the early DLSS 3 footage from the game has artifacts.

Comparatively, Cyberpunk and Flight Simulator made very solid showings for DLSS 3.

So I would appreciate it if you learned a little from the past four years and waited before shunning a tech based on one early showing, especially when you know that NVIDIA is going to keep hammering away at it until it's nearly perfect.
 
Haven't checked FS myself, but Cyberpunk exhibits similar, if less pronounced, artifacts in at least parts of the DF preview (which was controlled by NVIDIA).
 