So if I’m understanding correctly, they want developers to output 4K mips instead of the 1080p mips?
4K or 1080p are meaningless things on PC. Texture MIP LODs must be adjusted according to the output resolution when running DLSS, not rendering resolution.
Thanks, just wanted to confirm. But IIRC texture sizes do matter for LOD levels. If you have a 4096x4096 texture you have more LOD levels than with 512x512. So isn't this somewhat related? I guess it's related to texture resolution and not so much rendering resolution?
Texture LODs are generated automatically. It's only a question of setting up the renderer to do LOD bias correction according to output (display) resolution instead of rendering resolution. Even if your textures are 16x16 you will still have visible surfaces where a lower MIP bias would produce a sharper image on a display with more physical pixels.
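To make the LOD bias correction above concrete, here is a minimal sketch, assuming a D3D12 renderer (the struct and fields are standard D3D12; the function names and surrounding plumbing are illustrative and not from the thread). Nvidia's DLSS guidelines recommend a texture LOD bias of log2(render width / display width), which is negative whenever the internal resolution is below the display resolution and therefore selects sharper mips.

```cpp
#include <cmath>
#include <d3d12.h>

// Negative bias selects sharper (higher resolution) mips, so texture sampling
// matches the display resolution instead of the lower internal rendering resolution.
float ComputeDlssMipBias(float renderWidth, float displayWidth)
{
    return std::log2(renderWidth / displayWidth);  // e.g. log2(1920 / 3840) = -1.0
}

// Illustrative sampler setup; MipLODBias is the only field that matters here.
D3D12_SAMPLER_DESC MakeMaterialSampler(float renderWidth, float displayWidth)
{
    D3D12_SAMPLER_DESC desc = {};
    desc.Filter         = D3D12_FILTER_ANISOTROPIC;
    desc.AddressU       = D3D12_TEXTURE_ADDRESS_MODE_WRAP;
    desc.AddressV       = D3D12_TEXTURE_ADDRESS_MODE_WRAP;
    desc.AddressW       = D3D12_TEXTURE_ADDRESS_MODE_WRAP;
    desc.MipLODBias     = ComputeDlssMipBias(renderWidth, displayWidth);
    desc.MaxAnisotropy  = 16;
    desc.ComparisonFunc = D3D12_COMPARISON_FUNC_NEVER;
    desc.MinLOD         = 0.0f;
    desc.MaxLOD         = D3D12_FLOAT32_MAX;
    return desc;
}
```

With DLSS Performance mode at a 4K output (1920x1080 internal), for example, the bias works out to -1.0; a renderer that keeps a bias of 0 instead samples the mips it would have picked for a native 1080p display, which is exactly the blur being discussed here.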
DLSS upscaling has become a very attractive way to improve the performance of games. While integrating and using the SDK itself is straightforward, there are some common challenges with a sophisticated engine like Frostbite. These include correct handling of motion vectors, particle effects, half-resolution effects, and texture quality. It's also important to address the depth buffer, which is currently not upscaled with DLSS. We'll discuss how we have integrated DLSS with the latest Battlefield installment, the challenges we encountered, and the solutions we found.
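The abstract doesn't include any code, so purely as a hedged illustration of the motion-vector side of such an integration (generic TAA-style practice, not anything specific to Frostbite or the DLSS SDK): temporal upscalers expect the camera to be jittered by a different subpixel offset each frame, commonly from a (2,3) Halton sequence, while motion vectors are written unjittered and the same offsets are reported to the upscaler. The matrix convention in the comments below is an assumption.

```cpp
// Radical inverse in the given base; Halton bases 2 and 3 give a well
// distributed sequence of subpixel offsets in [0, 1).
static float Halton(int index, int base)
{
    float result = 0.0f;
    float f = 1.0f;
    while (index > 0)
    {
        f /= static_cast<float>(base);
        result += f * static_cast<float>(index % base);
        index /= base;
    }
    return result;
}

// Per-frame jitter in clip space: the offset is expressed in pixels of the
// render resolution and converted to NDC so the whole image shifts by a subpixel.
void GetClipSpaceJitter(int frameIndex, int renderWidth, int renderHeight,
                        float& jitterX, float& jitterY)
{
    const int i = (frameIndex % 16) + 1;        // skip index 0, repeat a short phase
    const float px = Halton(i, 2) - 0.5f;       // [-0.5, 0.5) pixels
    const float py = Halton(i, 3) - 0.5f;
    jitterX =  2.0f * px / static_cast<float>(renderWidth);   // added to proj[2][0] in a
    jitterY = -2.0f * py / static_cast<float>(renderHeight);  // row-vector D3D convention
}
```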
Avengers always had this DLSS option AFAIR.
Oh, I found the old quality modes too. Dynamic resolution, which is on by default, was blocking them from appearing. So maybe not a recent change after all, and "DLAA" has been in the game longer, dunno.
I'm actually seeing resolutions between 67% and 100% when entering a suitable framerate target. But it seems to be just switching between the same old DLSS presets based on the current framerate, so it's not a true "dynamic resolution" solution.
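As an aside, here is a sketch of the distinction being drawn, with invented names and thresholds (only the standard DLSS scale factors of 100, 66.7, 58, 50 and 33.3 percent are real; this is not how the game is actually implemented): snapping to a preset from a framerate target is a discrete choice, while a true dynamic-resolution loop nudges the render scale continuously every frame.

```cpp
#include <algorithm>
#include <cmath>

// Discrete: pick a standard DLSS preset from the observed framerate.
// Thresholds are made up for the example.
float PresetScaleForFramerate(float currentFps, float targetFps)
{
    if (currentFps >= targetFps)         return 1.0f;    // DLAA / native
    if (currentFps >= 0.90f * targetFps) return 0.667f;  // Quality
    if (currentFps >= 0.75f * targetFps) return 0.58f;   // Balanced
    if (currentFps >= 0.60f * targetFps) return 0.50f;   // Performance
    return 0.333f;                                       // Ultra Performance
}

// Continuous: adjust the scale a little each frame toward the frame-time budget.
// Linear scale follows the square root of the time ratio since cost tracks pixel count.
float DynamicScale(float previousScale, float lastFrameMs, float targetFrameMs)
{
    const float next = previousScale * std::sqrt(targetFrameMs / lastFrameMs);
    return std::clamp(next, 0.50f, 1.0f);
}
```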
Valve, the company behind the highly-anticipated Steam Deck handheld console, has posted an update (via Phoronix) for both the standard and experimental Proton compatibility layers that allow Windows games to run on Linux. The experimental version supports Nvidia Deep Learning Super Sampling (DLSS) in DirectX 12 games, while the regular version broadens support to more game titles.
...
Previously, Valve only used the DLSS function in games that ran on the Vulkan API. However, Valve's experimental support now allows DirectX 12 games to run DLSS without a problem, and it will arrive in the stable Proton branch after further testing.
...
To get this to run, you'll need to compile the latest version of the Proton Experimental branch, install the latest Nvidia drivers, and set the "PROTON_ENABLE_NVAPI=1" environment variable.
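As an assumption on my part about usage rather than anything stated in the article: the typical way to set that variable for a single Steam game is a launch option such as PROTON_ENABLE_NVAPI=1 %command%. The toy launcher below only illustrates the same variable being injected into a child process's environment on Linux.

```cpp
#include <cstdio>
#include <cstdlib>
#include <unistd.h>

// Toy illustration: export PROTON_ENABLE_NVAPI=1 and hand control to whatever
// command follows on the command line (e.g. a script that starts the game).
int main(int argc, char** argv)
{
    if (argc < 2)
    {
        std::fprintf(stderr, "usage: %s <command> [args...]\n", argv[0]);
        return 1;
    }
    setenv("PROTON_ENABLE_NVAPI", "1", 1 /* overwrite */);  // the variable from the article
    execvp(argv[1], argv + 1);  // replaces this process; returns only on failure
    std::perror("execvp");
    return 1;
}
```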
So is that DLSS or DLAA?
It's DLSS or both. It doesn't add DLSS to anything, just allows you to use it on Linux in DX12 games that already support it on Windows.
Although DLSS is a valid option for RTX card users in Alan Wake Remastered, some potential is left on the table. First, Remedy and d3t (like many developers before them) seem not to adhere to Nvidia's DLSS guidelines with regard to texture LOD determination: in Alan Wake Remastered, DLSS reduces texture quality, especially on distant textured objects (such as trees), which points to a lack of adaptation. Nvidia stipulates that DLSS should work with the texture LOD of the target resolution - in practice, however, most games use the MIP maps of the internal resolution. Second, the question arises whether it is needed at all: according to PCGH tests, Alan Wake Remastered runs very smoothly even in Ultra HD - we will deliver benchmarks later - so the relieving effect of DLSS is not required. From our point of view, Nvidia would be well advised to enable Deep Learning Anti-Aliasing (DLAA) in all DLSS games.