Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

Now that I've finally been able to get an RTX card (a 3060) for less than $500 CAD, I've put DLSS through its paces, and while I'm generally happy with it, there is one rather noticeable downside I hope they can improve in future releases, or that devs will pay more attention to - how it scales with already low-res buffers.

Basically, in three games now - Horizon Zero Dawn, Shadow of the Tomb Raider and Death Stranding - DLSS has produced very solid results, and in cases like Horizon, better than native TAA, with foliage quality that basically eliminates shimmering while providing crisp detail. That is, until I was in a scene with low-lying fog/mist that intersects with the foliage at certain distances under a depth-of-field effect, and yikes - big, blotchy, flickering pixels. Granted, to get 60fps I have to use DLSS Performance, but I tried Quality mode and it still stood out prominently. At native it was basically invisible, and this exaggerated effect wasn't seen with RIS or NIS either - albeit, of course, they're far inferior in every other image quality area.

Death Stranding also exhibits this in a similar situation involving a low-res buffer. When you first meet the President there's a cutscene that pans down from the ceiling with a DOF effect overlaid against a series of cables; with DLSS they're so blocky they look like 720p - which stands out even more against how crisp and stable the rest of the image is. In SOTTR, shining my flashlight around an underground cavern (which again produces a depth-of-field effect) results in anything lit by the flashlight being extremely low-res and pixelated.

These artifacts can really take you out of the scene when they occur, and they don't occur to nearly this extent with something like checkerboarding or even a resolution of ~1440p (which is around what SOTTR runs at on the PS5). I'll have to try more games with DLSS to see how common this is; I think there has to be a way for devs to keep DLSS from touching certain render targets (?), or this would be a more prominent critique. It was just somewhat surprising to me to see it occur this harshly.
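
If I had to guess at the mechanics, it's about where the upscale sits in the post-processing chain. Purely illustrative sketch below (every function name is hypothetical, not any engine's real API): effects like volumetric fog and DOF are often rendered at half or quarter resolution and composited before the upscale runs, so DLSS ends up magnifying those already low-res buffers instead of reconstructing them:

```cpp
// Illustrative frame ordering only; every function here is a hypothetical
// stand-in, not a real engine API. The point: anything composited *before*
// the upscale pass is reconstructed from the low internal resolution, so
// half/quarter-res effects like fog and DOF come out blocky, while work done
// *after* the upscale runs at full output resolution.
struct Frame { /* render targets, matrices, etc. */ };

static void RenderScene(Frame&) {}          // internal res, e.g. 1080p in 4K Performance mode
static void RenderVolumetricFog(Frame&) {}  // often quarter res -> very coarse pre-upscale
static void RenderDepthOfField(Frame&) {}   // often half res
static void UpscaleDLSS(Frame&) {}          // reconstructs everything composited so far
static void RenderGrainAndUI(Frame&) {}     // done at output res, unaffected

void RenderFrame(Frame& f)
{
    RenderScene(f);
    RenderVolumetricFog(f);   // the HZD mist artifacts likely originate here
    RenderDepthOfField(f);    // ...and here for the DOF cases
    UpscaleDLSS(f);
    RenderGrainAndUI(f);
}
```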
 
It might be related to which version of DLSS you are using.
In GPU forums, people are noting visual differences between DLSS versions, with the most recent version not necessarily the best. Currently 2.3.9 is the preferred version for most games.
 
Just wondering, which one did you get? Was it the MSI Ventus 2x? If so, what's the fan speed on it during gaming and full loads?
The Ventus 3X actually. I have to play with fan curves and of course your mileage will vary depending on the airflow of your case (mine is this one, very so-so at best for GPU airflow), but by default it's...ok.

It's definitely audible when it gets up to 65% fan speed (around 2200 RPM), which it will do on sustained loads of 95%+ for a while. It seems by default it wants to keep temps below 70°C, which is par for the course when using the defaults on any cooling equipment in a new PC - they're always going to err on the side of cooling regardless of noise. Not an annoying whine or anything, just a prominent whoosh.

However, 55% fan speed is far less audible, and 50% (~1800 RPM) is just barely detectable above my 140mm case fans, which run at 900 RPM - and at 50% it only raises temps by 2 degrees or so to boot. I turned off vsync and did a 10-minute run in Control with RT fully on/DLSS Performance to keep GPU usage at 99%, and with a locked 50% fan speed the temp maxed out at 73°C - that's also with a 100 MHz OC on the core and 500 MHz on memory.

The only drawback is that setting a custom fan curve in Afterburner disables the auto fan shut-off on desktop; 30% is as low as it goes when doing that for some reason - albeit at that speed it's basically indistinguishable from being fully off. So it's certainly more noticeable than my previous Gigabyte 1660, which was basically silent under load, but I haven't even touched undervolting or really played with fan curves at all. I'd say with this model, if you're OK with a ~75°C ceiling in a mediocre GPU-airflow case like mine, you can set a max fan speed of 50% and probably almost never hear it.

Definitely check every day for prices. The day after I nabbed this for $498 it was back up to $700 CAD.
 
The Ventus 3X actually. I have to play with fan curves and of course your mileage will vary depending on the airflow of your case (mine is this one, very so-so at best for GPU airflow), but by default it's...ok.

I can't actually fit anything but the smallest triple-fan GPUs (290mm, e.g. EVGA's smaller FTWs) unless I change cases; there's a longer-term plan for that some time down the road, but it would need to come after switching CPU/mobo platforms.

I've been trying to get an idea of how the basic dual-fan RTX 3060s compare (and also source a GA104-die one), as they don't tend to be reviewed. It's even hard to find heatsink pictures of some of them, such as the Ventus 2x, so I don't know for sure if it's actually the same heatsink as the one used on the 3060 Ti or different (possibly fewer heat pipes).

The only drawback is that setting a custom fan curve in Afterburner disables the auto fan shut-off on desktop; 30% is as low as it goes when doing that for some reason - albeit at that speed it's basically indistinguishable from being fully off.

I believe it's an issue with how Nvidia reports fan speeds on the 3000 series and how some AIBs set fan speeds in their BIOSes. The person who develops Afterburner has mentioned this in the Guru3D forum thread for Afterburner.

Definitely check every day for prices. The day after I nabbed this for $498 it was back up to $700 CAD.

I'm guessing you bought it from Amazon? I actually do have a Ventus 2x ordered for $478 from Amazon.ca. What I was hoping for was for the Gaming X to maybe make an appearance in stock at CC, since I'm guessing that heatsink is likely overkill for the 3060. Or maybe for the EVGA to come back in stock.

Although 3060 prices have rebounded slightly, starting back at $500 this week. We might be hitting the current support level for prices until miners start to sell off and/or next gen releases. But I don't know if I actually want to wait for next gen at this point, since the SKU I'd want (more towards the middle of the stack) may not even launch until 2023.
 
Zotac does dual-fan 3060s (and Tis); Asus also has one, the Asus GeForce RTX 3060 Dual V2 OC LHR 12GB.

The issue, though, is that the lower models are not commonly sampled and therefore rarely reviewed. That's a problem I've always had with the review system: they tend to focus on AIB variants that end up costing so much extra you might as well just move up to the next GPU tier, but that's another subject of discussion.

But the options cropping up here are -

Asus RTX 3060 Dual V2 - A complete unknown; can't find any review or detailed images of the heatsink.
EVGA RTX 3060 XC - The most-reviewed one. Somewhat aggressive fan curve. 2 heat pipes (but "double" length), metal backplate with actual thermal pad contact for VRAM and GPU.
Gigabyte RTX 3060 Eagle - 4 direct-contact heat pipes? Very aggressive fan curve based on 1 review.
MSI RTX 3060 Ventus 2x - 4 direct-contact heat pipes? Plastic... I mean "graphene" backplate.
MSI RTX 3060 Gaming X - Best option at a slight $10 premium, but I think that might have been a very low-stock one-off.
Zotac RTX 3060 Twin Edge - 3 heat pipes with a heat spreader? Zotacs have problematic warranty terms that affect resale. Zotacs also haven't gone down in price here for some reason, so this would be among the most expensive 3060s.

RTX 3060 Tis seem to be suffering from actual street price issues due to public perception. A lot of the public seems to just go by the initial impression from reviews/word of mouth that it was good value, but actual street prices are not reflecting that. They're available at 35% if not 40% more than the RTX 3060 (yet sell out faster), which makes the value proposition completely different from the ~20% gap in the initial MSRP comparisons ($399 vs. $329 USD). The other issue, personally, is that I'm not interested in spending that much, both nominally and relatively, on 8 GB VRAM cards anymore, for a variety of reasons including content creation purposes.
 
It might be related to which version of DLSS you are using.
In GPU forums, people are noting visual differences between DLSS versions, with the most recent version not necessarily the best. Currently 2.3.9 is the preferred version for most games.

Nvidia needs to make an easy-to-use tool to change DLSS versions, so there's no need to copy DLLs.

Or is it already in the Nvidia panel and/or GeForce Experience?


Edit: it's already automatic
I wonder if it also automatically stops itself if the DLL in the game's exe folder has been manually updated.

Anyway, DLSS 2.4.0 has been released and people say it has more streaks/blur than 2.3.9.
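
For what it's worth, the "copy DLL" workflow people mean is literally just replacing nvngx_dlss.dll next to the game's executable. A throwaway sketch of that swap (C++17; the paths are made-up examples, nvngx_dlss.dll is the real file name games ship - always keep a backup):

```cpp
#include <filesystem>
#include <iostream>

namespace fs = std::filesystem;

int main()
{
    const fs::path gameDll = "C:/Games/SomeGame/nvngx_dlss.dll";  // hypothetical path
    const fs::path newDll  = "C:/Downloads/nvngx_dlss_2.3.9.dll"; // hypothetical path
    const fs::path backup  = gameDll.string() + ".bak";

    try
    {
        if (!fs::exists(backup))
            fs::copy_file(gameDll, backup);                   // keep the original safe
        fs::copy_file(newDll, gameDll,
                      fs::copy_options::overwrite_existing);  // drop in the new version
        std::cout << "Swapped; restore from " << backup << " to revert.\n";
    }
    catch (const fs::filesystem_error& e)
    {
        std::cerr << "Swap failed: " << e.what() << '\n';
        return 1;
    }
}
```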
 
Nvidia needs to make an easy-to-use tool to change DLSS versions, so there's no need to copy DLLs.

Or is it already in the Nvidia panel and/or GeForce Experience?

They need profiles, so that the latest version contains all of the variations required for the best results in each game. What exactly changes between versions anyway? Is it the trained model, and how big are those?
 
Nvidia needs to make an easy-to-use tool to change DLSS versions, so there's no need to copy DLLs.
There is no "need" to change DLLs. The improvements you get from DLL substitution are minor at best.

it's already automatic
Nv does driver-side intercepts for some titles (CP2077 was an example), but generally speaking a game uses the DLL it shipped with. If the developers think it's not good enough, they can always push an update.
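
To make the "uses the DLL it shipped with" point concrete - my understanding (an assumption about the mechanism, simplified) is that the NGX layer resolves nvngx_dlss.dll by name, and Windows searches the application's own directory first, which is why a swapped-in file is what runs. A toy Win32 sketch:

```cpp
// Toy sketch of DLL resolution by name. Simplified assumption about how the
// NGX layer picks up the DLSS DLL: LoadLibrary searches the application's
// directory before system paths, so the file sitting next to the exe wins.
#include <windows.h>
#include <iostream>

int main()
{
    HMODULE dlss = LoadLibraryA("nvngx_dlss.dll"); // resolved from the exe's folder first
    if (!dlss)
    {
        std::cout << "nvngx_dlss.dll not found in the search path\n";
        return 1;
    }
    char path[MAX_PATH] = {};
    GetModuleFileNameA(dlss, path, MAX_PATH);      // which file actually got loaded?
    std::cout << "Loaded: " << path << "\n";
    FreeLibrary(dlss);
    return 0;
}
```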
 
DLSS Quality in the Matrix Awakens City Sample looked pretty bad, and here are the possible causes I investigated.

1. Coarse jitter sample pattern.
This makes the scene look blurry, but it's not a major problem, I think (see the sketch after the screenshots below).

UE5 City Sample:
[image]


DLSS SDK Sample (good case)
[image]
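
As a point of comparison for what a non-coarse pattern looks like: if I remember the DLSS integration guidance correctly, it recommends a Halton (2,3) sub-pixel jitter sequence with a phase count scaled to the upscale ratio. A self-contained sketch (the 8 × ratio² rule of thumb is from memory, so treat the constants as assumptions):

```cpp
// Halton low-discrepancy sequence: the usual source of "fine" jitter offsets.
#include <cstdio>

// Radical inverse in the given base.
static float Halton(int index, int base)
{
    float result = 0.0f;
    float fraction = 1.0f / base;
    while (index > 0)
    {
        result += (index % base) * fraction;
        index /= base;
        fraction /= base;
    }
    return result;
}

int main()
{
    // Assumed rule of thumb: phaseCount = 8 * upscaleRatio^2, so a 2x-per-axis
    // upscale (1080p -> 4K, i.e. Performance mode) cycles through 32 phases.
    const int phaseCount = 8 * 2 * 2;

    for (int frame = 0; frame < phaseCount; ++frame)
    {
        // Sub-pixel offsets in [-0.5, 0.5), applied via the projection matrix.
        float jx = Halton(frame + 1, 2) - 0.5f;
        float jy = Halton(frame + 1, 3) - 0.5f;
        printf("phase %2d: jitter = (%+.4f, %+.4f)\n", frame, jx, jy);
    }
    return 0;
}
```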



2. Bad motion vectors. I mean, REALLY bad, especially on thin objects.
The cross-patterned 5-pixel clusters in the air are all birds. You'll notice that the birds are barely visible, even though they're flying around the city all the time.
[image]


The motion vectors of the thin branches are very chunky and dirty. Compare with the actual geometry.
[image]

[image]


This causes the branches to appear disconnected and to flicker while moving the camera.
[image]


People walking far away have chunky motion vectors too. They're always moving with ghosting if you look closely.
[image]


This is what good motion vectors should look like:
[image]
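
For reference, the math that typically produces those per-pixel motion vectors, as a minimal standalone sketch - my own simplified version, not UE5's actual code (sign and UV conventions vary by SDK). Thin geometry tends to fail here when it rasterizes at under a pixel wide, or goes through a path (particles, imposters) that never writes correct previous-frame positions:

```cpp
// Minimal standalone sketch of screen-space motion vector computation, the
// quantity a temporal upscaler consumes per pixel. Simplified; not UE5's code.
#include <cstdio>

struct Float2 { float x, y; };
struct Float4 { float x, y, z, w; };

// Multiply a world-space point (implicit w = 1) by a row-major 4x4 matrix.
static Float4 Transform(const float m[16], float x, float y, float z)
{
    return { m[0]*x  + m[1]*y  + m[2]*z  + m[3],
             m[4]*x  + m[5]*y  + m[6]*z  + m[7],
             m[8]*x  + m[9]*y  + m[10]*z + m[11],
             m[12]*x + m[13]*y + m[14]*z + m[15] };
}

// Motion vector in UV units: where this surface point is now minus where it
// landed last frame, using last frame's transforms and position.
static Float2 MotionVector(const float currVP[16], const float prevVP[16],
                           float wx, float wy, float wz,    // world pos, this frame
                           float pwx, float pwy, float pwz) // world pos, last frame
{
    Float4 c = Transform(currVP, wx, wy, wz);
    Float4 p = Transform(prevVP, pwx, pwy, pwz);
    // Perspective divide to NDC [-1, 1], then scale by 0.5 to UV-space deltas.
    return { (c.x / c.w - p.x / p.w) * 0.5f,
             (c.y / c.w - p.y / p.w) * 0.5f };
}

int main()
{
    // Identity view-projection for both frames; a point that moved from
    // x = 0.1 to x = 0 should report a -0.05 UV motion vector.
    const float I[16] = { 1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1 };
    Float2 mv = MotionVector(I, I, 0.0f, 0.0f, 0.0f, 0.1f, 0.0f, 0.0f);
    printf("mv = (%+.3f, %+.3f)\n", mv.x, mv.y);
    return 0;
}
```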



So, who should address this problem?
 
12 More Games Add DLSS, Including F1® 22 & HITMAN 3

Each month we work with developers to enhance games with NVIDIA DLSS and other RTX technologies, giving GeForce gamers the definitive experience. To date, over 250 games and applications have added RTX technologies, and at COMPUTEX 2022 we announced that DLSS is headed to another 12 games.

Headlining the new additions are F1® 22 and HITMAN 3, which are both introducing support for NVIDIA DLSS and ray-traced effects. DLSS is also coming to or has just launched in Deep Rock Galactic, LEAP, Ghost, Loopmancer, Hydroneer, Propnight, Raji: An Ancient Epic, Vampire: The Masquerade - Swansong, Turbo Sloths, and Warstride Challenges.

https://www.nvidia.com/en-us/geforce/news/geforce-computex-2022-announcements/
https://www.nvidia.com/en-us/geforce/news/computex-2022-rtx-dlss-game-updates/

 
Hitman 3 seems to have a solid DLSS implementation despite using a somewhat outdated SDK version, 2.3.2 (2.4.3 is the latest, I believe; 2.3.2 is about 9 months old), and omitting a user-controllable sharpness option, again.
 