Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

Yessir. This assumes you've enabled the (DL)DSR options in the NVIDIA driver, of course.

In which case it's pretty simple, IMO, although I appreciate how we can make it seem overly complex here by analysing the inner workings of these up/down-scaling solutions.

In reality though, you simply turn on DLDSR in the driver (arguably it should be on by default), then you select the resolution you want in game like you always would (higher being better) and off you go.

Then if performance isn't high enough, you turn on DLSS at the level that brings you back to the performance you want. If that doesn't work, lower the resolution.
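To put rough numbers on those "levels": each DLSS mode renders internally at a fraction of your chosen output resolution and upscales from there. The per-axis scale factors below are the commonly cited defaults, but individual games can deviate, so treat this as an illustrative sketch:

```python
# Commonly cited per-axis scale factors for the DLSS quality modes
# (defaults only -- individual games are free to deviate).
DLSS_MODES = {
    "Quality": 1 / 1.5,            # ~67% per axis
    "Balanced": 1 / 1.72,          # ~58% per axis
    "Performance": 1 / 2.0,        # 50% per axis
    "Ultra Performance": 1 / 3.0,  # ~33% per axis
}

def internal_resolution(width, height, mode):
    """Approximate resolution DLSS renders at before upscaling."""
    scale = DLSS_MODES[mode]
    return round(width * scale), round(height * scale)

# e.g. a 4K output in Performance mode is internally a 1080p render
print(internal_resolution(3840, 2160, "Performance"))  # → (1920, 1080)
```

Which is why stepping down one DLSS level buys back so much performance: at 4K, Performance mode is pushing a quarter of the pixels of a native render.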

That's the joy of PC gaming.
 
No the joy of pc gaming is endlessly tweaking settings and benchmarking so much that you’re bored of your games before you finish the tutorial levels.
 
So I'm not the only one? :D :D

In my much younger years, I honestly did find joy in the tweaking and benchmarking of games. Given how long most of us have been on this forum, I'm far from the only one who started their game tweaking days by brutally optimizing config.sys and autoexec.bat. Sure, any dipweed could drop himem.sys, emm386.exe /noems and a dos=high,umb into config.sys and pick up a decent 580KB of low memory. Then you'd get the intermediate folks who would build boot options into config.sys, which also permitted branching in autoexec.bat so you could pick and choose which drivers made sense to actually load. Then you got us voodoo ninja badasses who would override emm386.exe's attempt at autodiscovering empty upper memory (640KB -> 1MB region) blocks, forcing the inclusion of upper memory regions where (hopefully!) the various BIOSes weren't shadowed, which inevitably would crater your machine (i=b800-bfff x=c000-c7ff i=d000-d255). Then the real masochists would reorder the load order of DOS drivers to optimize how they all fit into the newly recovered upper memory space, to exactly get smartdrv crammed in there alongside ansi.sys and mouse.drv and oakcdrom.sys and mscdex. You could even loadhigh the entire command.com if you were ballsy.
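For the youngsters wondering what that actually looked like, here's a minimal sketch of the idea (paths, driver names and the I=/X= ranges are illustrative, not a known-good config for any particular machine):

```
REM ---- CONFIG.SYS (sketch) ----
DEVICE=C:\DOS\HIMEM.SYS
REM NOEMS skips the EMS page frame; I=/X= force-include/exclude UMB ranges
DEVICE=C:\DOS\EMM386.EXE NOEMS I=B800-BFFF X=C000-C7FF
DOS=HIGH,UMB
REM DEVICEHIGH pushes drivers into the recovered upper memory blocks
DEVICEHIGH=C:\DOS\ANSI.SYS
DEVICEHIGH=C:\DRIVERS\OAKCDROM.SYS /D:MSCD001

REM ---- AUTOEXEC.BAT (sketch) ----
LH C:\DOS\SMARTDRV.EXE
LH C:\DOS\MSCDEX.EXE /D:MSCD001
LH C:\DOS\MOUSE.COM
```

Get the load order wrong and something big like smartdrv wouldn't fit in any remaining UMB and would silently land in conventional memory anyway, which is exactly the Tetris game being described above.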

Remember when overclocking was a static-speed soldered crystal, eventually augmented by jumpers to select from several base speeds, with divider jumpers for ISA and VLB speeds, only eventually permitting clock multiplier jumpers? I remember the very first 83MHz-capable motherboard I owned, and I ran my Pentium 166MHz with a 3x multiplier like a badass -- it took me days to get it perfectly stable, and then like a month later I cooked a cheap PCI network card because the PCI bus was running at 41.6MHz lol.
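For anyone who never owned one of those boards: everything hung off the front-side bus at fixed ratios, which is why one jumper change rippled through the whole machine. A hypothetical helper showing the arithmetic (the PCI divider varied by board; /2 is assumed here):

```python
def derived_clocks(fsb_mhz, cpu_multiplier, pci_divider=2):
    """On a classic Socket 7 board, the CPU and PCI clocks are both
    fixed ratios of the front-side bus -- raise the FSB and
    everything attached to it overclocks together."""
    return {
        "cpu_mhz": fsb_mhz * cpu_multiplier,
        "pci_mhz": fsb_mhz / pci_divider,
    }

# The 83MHz-bus story above: the CPU flies, but the PCI bus runs
# ~25% out of spec, which is how a cheap network card ends up cooked.
print(derived_clocks(83.3, 3))
```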

Thirty years later, I'm not there anymore. Yeah, I run an overclocked rig, but I also spent like 30 minutes testing it and then started turning it down a little each time I found a game that crashed. No matter that I'm too old for that shit, I'm never going to begrudge anyone who wants to spend hours, days and even weeks tweaking their rig over and over, benchmarking the same dumb shit over and over, just so that last little half-percent performance increase they stumbled into finally remains stable.

To those who find joy in benchmarking and tweaking instead of playing: good for you. No small part of where I've landed professionally came from those days doing that same shit.
 
To be honest, I sometimes find much more joy in tweaking and benchmarking games than playing them. That's part of why I bought a laptop instead of a desktop. The joy of running Control's "CORRIDOR OF DOOM" at 60 FPS with Raytracing on at WQHD (using DLSS, of course) with very few compromises on my measly 2060 laptop far outweighs any fun I've had with the game; it's just boring to me personally. Not to say it's a bad game, of course, it's just not my cup of tea. Same with other games like Metro Exodus Enhanced Edition. While 300-watt desktop cards waste their performance at max settings and native resolution, my 2060 laptop reaches very similar experiences using clever techniques like DLSS and careful settings tweaking (decrease settings that cost a lot but don't contribute much to the overall graphics; enable settings that do make a striking difference, such as Raytracing, but keep them at high or medium to save as much performance as possible, etc.). It's just fascinating to me.

There are exceptions of course, you know, games I actually play. Death Stranding, Psychonauts 2 and now God of War, these are excellent games imo. Wish they had Raytracing.
 
In the console section, these optional things and Windows are seen as a curse. It depends on which user group you're approaching.
 
Know exactly what you mean by the tweaking and testing being more fun than the actual gaming.

I prefer desktops though because you can do everything you described above, then change a component, and do it all again :D
 
The modder responsible for the God of War hack that removes DLSS sharpening has figured out the values for Doom Eternal, which reportedly was still applying DLSS sharpening after the Nov patch even when the in-game sharpening slider was set to 0.

What was interesting in this thread, though, was that they were attributing instances of DLSS artifacts, at least in recent games, to this sharpening. Basically the issue, at least from their perspective, is not only that DLSS in some games applies too much sharpening, but also that Nvidia's sharpening algorithm in DLSS is particularly poor - disabling DLSS sharpening and adding sharpening via ReShade using CAS apparently produces a far better result.

There are good sharpening algorithms (I for one like CAS in ReShade) and then there are bad ones. The one applied in the DLSS sharpening pass is absolutely horrible. What makes it even worse is that it seems to be motion-based: the more you move, the stronger the sharpening gets, creating haloing and ghosting trails and just making the image look fuzzy. When you stand still, it's very mild in comparison. That's why I think some people don't notice it.

My suggestion is to remove the DLSS sharpening and add ReShade with its CAS filter to counter the blurriness (or use the one in the NVCPL/Freestyle, although it's IMO inferior to CAS). This will look a hundred times better than the DLSS sharpening.
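For the curious, the reason CAS tends to hold up better is that it scales its sharpening strength by local contrast instead of applying a fixed kernel everywhere, so already-contrasty edges are left mostly alone. A rough single-channel sketch of that idea in Python/NumPy (simplified from AMD's FidelityFX CAS shader, not the exact production math):

```python
import numpy as np

def cas_sharpen(img, sharpness=0.5):
    """Contrast-adaptive sharpening sketch for a 2D array in [0, 1]."""
    p = np.pad(img, 1, mode="edge")  # so the 3x3 cross exists at borders
    n, s = p[:-2, 1:-1], p[2:, 1:-1]
    w, e = p[1:-1, :-2], p[1:-1, 2:]
    c = img
    mn = np.minimum.reduce([n, s, w, e, c])
    mx = np.maximum.reduce([n, s, w, e, c])
    # Adaptive amount: approaches 0 where local contrast is already high,
    # so hard edges don't get haloed.
    amount = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / np.maximum(mx, 1e-6),
                             0.0, 1.0))
    # Negative lobe per cross neighbour, scaled by the user sharpness knob.
    lobe = -amount * (0.125 + 0.075 * sharpness)
    out = (c + lobe * (n + s + w + e)) / (1.0 + 4.0 * lobe)
    return np.clip(out, 0.0, 1.0)
```

Flat regions pass through untouched, midtone detail gets boosted, and the strength never depends on motion - which is exactly the failure mode being complained about in the DLSS pass.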
 
And much faster. Laptops are great; I have a 3080m-equipped one, but my 2018 2080 Ti is still faster for most things.... Though I have to say laptops have gotten much, much better in performance, acoustics, overall quality, etc.
Something to test next time you're playing on the laptop: use the Windows power profile settings to set Processor power management -> Maximum processor state to 99% instead of 100%. This effectively disables the turbo function on modern CPUs, which (on a laptop) means you have a ton more thermal headroom for the GPU to maintain clocks. With a ton of testing, I've discovered basically NONE of my Steam game library needs CPU power in excess of the base clock of the i7-8750H in that rig. The difference in how well my (paltry in comparison to yours) 1070 Max-Q holds clocks is astounding when the CPU turbo is disabled.
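If you'd rather script it than click through the control panel, the same cap can (I believe) be set from an elevated prompt with powercfg, using its built-in setting aliases - shown here for the AC profile of the active plan; verify the alias names with `powercfg /aliases` on your machine:

```bat
:: Cap "Maximum processor state" at 99% on AC for the active power plan
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMAX 99
:: Re-apply the plan so the new value takes effect
powercfg /setactive SCHEME_CURRENT
```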
 
Microsoft Flight Simulator is getting DLSS and DirectX 12 improvements with future updates | OC3D News (overclock3d.net)
January 27, 2022
As part of the company's latest developer Q&A live stream, developers at Asobo have confirmed that the studio is working to add support for Nvidia's DLSS technology to Microsoft Flight Simulator's DirectX 12 mode. Nvidia's DLSS tech is an AI upscaling technique for RTX-series GPU users that has the potential to deliver a dramatic performance uplift in Microsoft Flight Simulator.

The addition of DLSS to Microsoft Flight Simulator should make the game much easier to run at high resolutions and framerates on supported hardware, though it is worth noting that there currently appears to be no plan to add DLSS support for Microsoft Flight Simulator's DirectX 11 codepath. Currently, DirectX 11 provides the most stable framerates in Microsoft Flight Simulator, which means that DLSS will likely not get added to Microsoft Flight Simulator until after DirectX 12's performance and framerate stability is improved.
...
Asobo currently has in-development builds of Microsoft Flight Simulator with DLSS already integrated, and is reportedly pleased with the technology. Asobo has not yet discussed integrating alternative upscaling technologies like AMD's FidelityFX Super Resolution or Intel's XeSS AI upscaler.
 
God of War came out on PC earlier this month, and Santa Monica and Jetpack have already released four patches for it. The teams have just released a brand new update that adds support for a DLSS Sharpening slider and fixes a number of issues. Going into more details, Update 1.0.4 fixes some rare instances of graphics driver crashes. It also fixes an issue with incorrect VRAM detection on the Intel XE platform. Additionally, it resolves a crash that could occur at client shutdown. As always, Steam will download this update the next time you launch its client.

God of War Patch 1.0.4 Release Notes
Fixes

  • Atreus will now reset his state during restart from checkpoint or saved game should he become unresponsive.
  • Fixed some rare instances of graphics driver crashes.
  • Fixed an issue with incorrect VRAM detection on Intel XE platform.
  • Also fixed an issue where the display mode setting would visually set to windowed mode when resetting display settings to default on an ultrawide monitor.
  • Fixed an issue where control functionality would be lost if opening inventory during realm travel sequence.
  • Fixed a crash that could occur at client shutdown.
Features
  • Added support for DLSS Sharpening slider.
Other Changes
  • Added additional logging to crash reports to help identify root causes of intermittent crashes.
 
Btw, any idea why using DLSS results in lower-resolution reflections?

I experienced it in Cyberpunk and Bright Memory Infinite.
 