Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

It's nice, isn't it? Since summer 2021 I've had my G15 (15.6", RTX 3070m/5800H/32GB/1TB NVMe and a 144Hz panel) and it handles everything I throw at it; it's quite close to my 2080 Ti desktop, which is something I didn't expect. It's not that noisy either, doesn't throttle even in hot summer temps, and the panel is quite good considering the price ($1600-1700). The little machine outdoes the PS5 in Spider-Man, and it should given the specs, but still, it's a laptop. We have come a long way in the world of laptops.
Yes, laptops are pretty cool. The thought of having desktop-level performance in a tiny case really fascinates me. Plus, laptops are way more energy efficient, which is better for the environment and doesn't heat up the room as much. On my 2060 laptop, with a little tweaking, I can hook it up to my WQHD monitor and get a stable 1440p60 experience with DLSS Performance and ray tracing in most games I've played, which is very nice. That's indeed a great level of versatility.

Now if they would only get rid of Optimus (while keeping the battery life, of course) and instead make an entire high-performance SoC like the Apple M1... I can tell you, wiring the HDMI port to the dGPU was a very bad idea if you switch between your internal screen and an external monitor constantly. Thankfully Windows 11 and Legion Toolkit allow me to have a good experience now. Legion Toolkit basically disables and enables the dGPU when I disconnect the AC adapter and external monitor, so the GPU won't stay awake killing battery life (who at Nvidia and Intel designed this crap?). Before Windows 11, the whole of Windows would stutter heavily whenever I connected my monitor. That is gone now too with Windows 11.

DLSS requiring drivers 512.15 and above makes me sad, because every driver after 511.23 can make my system crash when I let my external monitor go to sleep for a while and then try to wake it up. Not frequent, of course, but it can happen... Sigh. Not everything is rosy in the laptop world.
 
Laptops have made great strides in the last decade, especially the last four or so years. We're not quite there yet, but gaming on a laptop isn't a bad idea anymore. An APU like in the consoles could be something (a higher-performance APU), but I think these issues should be fixable without one as well, especially on the software side.
Only downside with my G15 is that I can't undervolt the 5800H. Not that it's really needed, since neither it nor the GPU throttles, but lower temps/better efficiency is always welcome.

I think Intel laptops can be undervolted.
 
This made me realise that a Switch 2 with DLSS has a lot of potential. Granted, my laptop still consumes more power than a Switch does, especially the Ryzen 5800H CPU (albeit clocked at barely 1 GHz), and the GPU alone was consuming 25-30 W. However, we've already seen impossible ports to the current Switch, so I'm confident a new one with DLSS Performance in portable mode would be amazing.
 
It's certainly a balance, as the smaller screen can hide the artifacts that reconstruction can bring, but also the lower the starting res, the more reconstruction tech struggles. I mean, DLSS Performance mode at 4K output can be problematic in some games, and on a Switch we're talking what, 480p (or lower) reconstructed up to 720p?

With a closed system there can perhaps be more opportunity to fine-tune the implementation; I think part of the reason some games have DLSS pitfalls on the PC is just not paying attention to the edge cases with certain effects that don't scale. But I think people are perhaps extrapolating DLSS performance figures from the PC and thinking "Well, we could get double the performance if Switch had DLSS!" without really considering how well DLSS would actually work with a ~480p input res. Based on my experience of how DLSS behaves below 4K, I doubt it would actually provide a significant benefit over just bilinear scaling. It could be 'sharper', yes, but with such a low starting res the chances of DLSS interpreting something inaccurately go up dramatically.
 
That's true, but a Switch 2 should probably be able to manage at least a 720p input resolution. Demanding games with dynamic resolution on the current Switch scale from 320p to 720p, with 540p as a common resolution. Why wouldn't a Switch 2 with Ampere-based hardware reach a 720p input already?
 
Right. If we assume the Switch 2 uses the 6- or 8-core Orin NX with 1024 Ampere CUDA cores and ~100 GB/s of memory bandwidth, it shouldn't have much trouble hitting a 720p native input resolution in just about any game out today. I suppose one question is how well DLSS with a 720p input resolution would scale across the 32 or so Tensor cores that the Orin NX has to output a 1080p image.
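
To put some rough numbers on that, here's a quick back-of-the-envelope sketch in Python of the internal render resolutions the standard DLSS modes imply. The per-axis scale factors are the publicly documented ones; the 1080p docked / 720p handheld output targets are just my assumption for a hypothetical Switch 2.

Code:
# Back-of-the-envelope sketch: internal render resolution for each DLSS mode
# at a given output resolution, using the published per-axis scale factors.
DLSS_SCALE = {
    "Quality":           1 / 1.5,    # ~66.7% per axis
    "Balanced":          1 / 1.72,   # ~58% per axis
    "Performance":       1 / 2.0,    # 50% per axis
    "Ultra Performance": 1 / 3.0,    # ~33.3% per axis
}

def render_resolution(out_w, out_h, mode):
    """Return the (width, height) a game would render at before DLSS upscales it."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# Assumed output targets for a hypothetical Switch 2: 1080p docked, 720p handheld.
for out_w, out_h in [(1920, 1080), (1280, 720)]:
    for mode in DLSS_SCALE:
        w, h = render_resolution(out_w, out_h, mode)
        print(f"{out_w}x{out_h} output, {mode}: {w}x{h}")

So a 1080p output in Quality mode means a 1280x720 internal render, while a 720p output in Performance mode would be starting from 640x360, which is where the earlier concern about very low input resolutions comes in.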
 
So having finally acquired an Ampere GPU, I'm checking out Cyberpunk on my 2nd run with RT on Ultra and DLSS Quality, using an updated DLSS 2.3.x. Since I was limited to screenshots and some videos before, I really wanted to see what it was like for myself, and it's very impressive. It allows me to run everything in the game maxed (apart from RT Psycho...) and keep my GPU usage below max to maintain a lower heat output, while also providing better IQ than "native".

There was a clearly noticeable issue with flashing lights when DLSS was on, which was rather annoying but not game-breaking. After some research I found out this was due to a lackluster DLSS implementation from CDPR, where they didn't have enough precision on the sharpness filter value; it could be fixed with specific settings in an ini file, and now it works fine.
 
The effect is subtle and screenshots don't do it justice, but in motion it gives off a very natural glow all around you and you'll find it hard to turn it back down to Ultra.

When using the top-end Psycho option for ray traced lighting, full global illumination is introduced which simulates light scattering as it plays off each surface. That means, photons of light bounce off these surfaces transmitting colour information to a second surface - it absorbs and transmits this colour creating a more natural, realistic scene. It simply enhances the already excellent ray traced lighting features.

https://www.eurogamer.net/digitalfoundry-2020-cyberpunk-2077-high-end-pc-tech-analysis
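
For anyone curious what that last quoted paragraph means in practice, here's a tiny toy example in Python of a single diffuse bounce, i.e. the colour bleeding the article describes: white light hits a red wall, and the lit wall in turn tints a nearby grey floor. It has nothing to do with CDPR's actual renderer, and every value in it is made up purely for illustration.

Code:
# Toy single-bounce diffuse global illumination, treating the wall as a
# unit-area Lambertian patch. Not CDPR's code; just illustrating the idea
# of colour being carried from one surface to another.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def cos_term(normal, direction):
    """Lambertian cosine factor, clamped at zero."""
    return max(sum(n * d for n, d in zip(normal, direction)), 0.0)

light_colour = (1.0, 1.0, 1.0)          # white light
wall_albedo  = (0.8, 0.1, 0.1)          # red wall
floor_albedo = (0.5, 0.5, 0.5)          # neutral grey floor

wall_point,  wall_normal  = (0.0, 1.0, 0.0), (1.0, 0.0, 0.0)
floor_point, floor_normal = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
light_dir_at_wall = normalize((1.0, 1.0, 0.0))

# Bounce 1: direct light reflected by the wall (constant factors simplified).
wall_radiance = tuple(lc * wa * cos_term(wall_normal, light_dir_at_wall)
                      for lc, wa in zip(light_colour, wall_albedo))

# Bounce 2: the lit wall acts as a small light source for the floor.
to_wall  = normalize(tuple(w - f for w, f in zip(wall_point, floor_point)))
to_floor = tuple(-c for c in to_wall)
dist2    = sum((w - f) ** 2 for w, f in zip(wall_point, floor_point))

indirect = tuple(wr * fa
                 * cos_term(floor_normal, to_wall)   # floor facing the wall
                 * cos_term(wall_normal, to_floor)   # wall facing the floor
                 / (math.pi * dist2)                 # distance falloff
                 for wr, fa in zip(wall_radiance, floor_albedo))

print("indirect light hitting the floor (RGB):", indirect)  # comes out red-tinted

The printed value is red-dominant, which is the "photons transmitting colour information to a second surface" effect described above, just at the scale of two points instead of a whole scene.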
 
The team has added a new "DLSS" section in the Advanced Tab of GPU-Z that locates all of the installed games on your system that support NVIDIA DLSS technology and tells you which version of DLSS they're using. That's a super-awesome little addition to what is already a daily-use piece of software for many people, myself included.

GPU-Z 2.48.0 changelog

Added new "DLSS" section to Advanced Tab, which will locate all installed games with DLSS support and report their DLSS version
GPU-Z will no longer send traffic to www.techpowerup.com and uses www.gpu-z.com exclusively, which makes it easier for IT administrators to block traffic originating from GPU-Z. All previous endpoints on techpowerup.com will be disabled soon, please update your firewall rules accordingly
When an NVIDIA Engineering Sample GPU is installed, GPU-Z will block all network activity (feature request by NVIDIA)
Many improvements to Intel Arc detection, sensors, reporting and specs
Renamed Intel discrete GPU power sensor to "GPU Chip Power Draw" to clarify that it does not measure whole board power, but GPU chip power only
Improvements to Chinese translation
Added detection for Advantech vendor Id
Fixed fan speed monitoring on Intel DG1 with newer drivers
Fixed RTX 3080 12 GB release year
Fixed Ryzen 5800H release date
Fixed RV670 die size
Added support for NVIDIA GeForce RTX 3050 OEM, MX550 (TU117-A), RTX A5500, A5500 Mobile, A4500 Mobile, A3000 12 GB Mobile, A1000 Embedded
Added support for Intel Core i5-1230U, several new Arc SKUs
Added support for AMD FireStream 9170
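
Out of curiosity, here's roughly how a scan like that could work. To be clear, this is not GPU-Z's actual implementation, just a guess at the general idea: DLSS titles ship an nvngx_dlss.dll next to the game executable, and that DLL's file version is the DLSS version, so walking your game folders and reading the version resource gives a similar report. The Steam library path below is only an assumption, adjust it for your setup.

Code:
# Guess at the general idea behind GPU-Z's DLSS report: find nvngx_dlss.dll
# in game folders and read its Windows file-version resource.
# Requires pywin32 (pip install pywin32).
import os
import win32api

SEARCH_ROOT = r"C:\Program Files (x86)\Steam\steamapps\common"  # assumed library path

def dll_version(path):
    """Read a DLL's file-version resource as 'a.b.c.d'."""
    info = win32api.GetFileVersionInfo(path, "\\")
    ms, ls = info["FileVersionMS"], info["FileVersionLS"]
    return f"{ms >> 16}.{ms & 0xFFFF}.{ls >> 16}.{ls & 0xFFFF}"

for dirpath, _dirs, filenames in os.walk(SEARCH_ROOT):
    for name in filenames:
        if name.lower() == "nvngx_dlss.dll":
            full = os.path.join(dirpath, name)
            try:
                version = dll_version(full)
            except Exception:
                version = "unknown"
            print(f"{dirpath}  ->  DLSS {version}")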
 
Hopefully Nvidia will take DLSS to the next level instead of relying on last gen sharpening techniques to improve visual fidelity.
That too is a good take. I still believe DLSS was fine without sharpening; I hate that some games force it, despite the SDK saying they should add a toggle/slider. Then some devs actually do add a slider, but setting it to 0% does not disable it. It's a mess. Dev DLSS works, but why should we rely on it...
 
Interesting. Seems like something went wrong with DLSS in the CP2077 1.6 update.

Look at the hobo wandering around at the start of the video.

 