Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

There's no mention in the piece of whether they were using the latest updates released this week; I would assume so, but it would be nice to get confirmation. Particularly in the case of Crysis 3, which received a DLSS update that supposedly improved the ghosting on vegetation - they mention it, but don't really show comparison images where it's visible.

They also seem to have mixed up the post-processing TXAA vs. DLSS image comparisons - at least going by their description of how inferior DLSS is when zooming in with the bow while DOF is applied, their image comparisons show the exact opposite.
 
A different opinion on DLSS in GOTG:
Usually, DLSS is hard to fault when it comes to image stability, but that just isn't the case in Guardians of the Galaxy. On the contrary: even with DLSS on "Quality", image stability is worse at all resolutions than with the game's TAA, so DLSS is not convincing in this respect either. Some objects are smoothed absolutely flawlessly with DLSS, but even DLSS on "Quality" visibly flickers more than rendering without Nvidia's AI upsampling, even in Ultra HD. With a more aggressive DLSS mode, the flickering intensifies accordingly.

This is the biggest problem with DLSS in the game, but there is also a second major problem area: graphical errors that DLSS makes worse in GotG. They come from the game itself and are only slightly noticeable with the normal TAA. The characters in the numerous cutscenes tend to suffer from ghosting, and their detailed hairstyles are not rendered very cleanly. With DLSS, both effects are exacerbated. This is not an issue during gameplay, but it is in the cutscenes, where the ghosting is immediately noticeable. Here too, lower resolutions and more aggressive modes make the problem worse, but even in Ultra HD it is already clearly visible with DLSS on "Quality". The same applies to a strong moiré effect on the many grates the player repeatedly walks over. These artifacts are generated by the game itself, and TAA has to contend with them as well, but DLSS intensifies the effect further.

A near-universal problem area with DLSS is incorrectly set mip-map levels. The game then does not use the mip maps appropriate for the target resolution, but only those for the render resolution. As a result, some objects are displayed in less detail than at native resolution, while other details are lost entirely. This has only a small impact on image quality, but it is still a negative point.
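For context on why this happens: when a game upscales, the textures are sampled at the lower render resolution, so without a correction the sampler picks blurrier mip levels than the output resolution warrants. The commonly cited fix is a negative texture LOD bias derived from the ratio of render to display resolution. A minimal sketch of that calculation (function name is mine, not from any SDK):

```python
import math

def dlss_mip_bias(render_width: int, display_width: int) -> float:
    """Texture LOD bias to apply when upscaling.

    log2(render / display) is negative when render < display,
    which biases sampling toward sharper (lower) mip levels so
    texture detail matches the output resolution rather than the
    internal render resolution.
    """
    return math.log2(render_width / display_width)

# Example: DLSS "Quality" at 4K renders internally at 2560x1440.
bias = dlss_mip_bias(2560, 3840)
print(round(bias, 3))  # about -0.585
```

If a game skips this bias (or applies the wrong one), you get exactly the symptom the review describes: mip maps chosen for the render resolution, not the target resolution.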
https://www.computerbase.de/2021-10.../#abschnitt_die_bildqualitaet_von_nvidia_dlss
 
So the problem with Severed Steel is I'm basically cpu limited in the DX12 mode, which is the only way to enable DLSS. In the DX11 mode performance is WAY better. Not really surprising for UE4.
Why would they lock DLSS under DX12? You sure that you can't use it in DX11?
 