Nvidia GeForce Drivers Release Announcement thread

That's amazing for 720p. Is the ghosting under control?

At 1440p DLSS Performance I think you could reasonably run path tracing on a 4070. Add on FG, which I've found works fine at a 45-50 fps base, and you could have a pretty decent experience. It wouldn't be good for competitive shooters, but I don't think those games have PT anyway.

I've only done a really quick look, so I'm not sure about ghosting. There wasn't any noticeable ghosting on the particles around the gemstone in Remnant 2, and those are constantly moving, but I wasn't specifically looking for it either. I generally adjust my settings to get 120 fps or more, and that really helps minimize issues with DLSS anyway.
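For anyone wondering where the 720p and 480p numbers in this thread come from, here's a quick Python sketch using the commonly cited DLSS scale factors (the exact ratios are my assumption, not something I've verified against a spec, and games can use custom ones):

```python
# Rough internal render resolution for the common DLSS presets.
# Scale factors are the widely cited defaults; individual games can override them.
DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(width, height, preset):
    s = DLSS_SCALE[preset]
    return round(width * s), round(height * s)

for preset in DLSS_SCALE:
    w, h = internal_resolution(2560, 1440, preset)
    print(f"1440p {preset}: ~{w}x{h} internal")

# Performance works out to 1280x720 and Ultra Performance to roughly 852x480,
# which is where the "720p" and "480p internal" figures come from.
```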
 

I'm only halfway through the video and already very confused. Is the new FG model used on Ada at all? The VRAM saving would be most welcome, but I'm not sure if Ada stays with the old model or gets the new model capped at 2X FG.
 
The frame gen model is updated on Ada as well according to Nvidia.

 

This might be the worst video they've ever done. I just finished watching. He spends most of the time memeing and intentionally making things more confusing than they actually are. It's like listening to a person complain about their job for 40 minutes. He could have made a good video, but he spent most of it griping.

The 40 series gets the enhanced version of the old FG (2x) that reduces memory usage and improves "performance." It's the version for the optical flow accelerator. MFG is exclusive to the 50 series.
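For reference, the multipliers are just napkin math on top of the base frame rate; a tiny sketch, ignoring FG's own overhead (which does exist in practice):

```python
# Naive frame generation arithmetic: for every rendered frame, FG inserts
# (multiplier - 1) generated frames. This ignores the cost of running FG
# itself, which eats into the base frame rate a bit in practice.
def fg_breakdown(base_fps, multiplier):
    output_fps = base_fps * multiplier
    generated = output_fps - base_fps
    return output_fps, base_fps, generated

for mult in (2, 3, 4):  # 2x on the 40 series, 3x/4x MFG on the 50 series
    out, rendered, generated = fg_breakdown(60, mult)  # 60 fps base is just an example
    print(f"{mult}x from a 60 fps base: {out} fps output "
          f"({rendered} rendered + {generated} generated per second)")
```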

 
I should've watched for another 60 seconds and my question would've been answered 😆

The video was very annoying to watch. Like the script was written by a budget Leslie Nielsen.
 
The very first sentence of the video is "Nvidia is being a dick." Not the most auspicious start, but OK, at least he's not leaving any uncertainty about where he stands. What makes the video so unpleasant is that he then proceeds to be a dick for pretty much the rest of it. It's just exhausting.

And it's a shame because there's a really good point there about branching off an obsolete version of PresentMon.

I find it ironic how people keep bringing up this complaint that they can't tell how many frames are real and how many are fake. I mean, sure. But also, that pretty much means it is working, right?
 
@Florin The whole thing about PresentMon is interesting, but there are good reasons why a company might want to branch off an open tool like PresentMon, especially during R&D, and he explores none of them. He mostly seems mad that it makes his job harder because it complicates apples-to-apples comparisons, which is something a consumer doesn't need to do; most people are not running multiple PCs and trying to compare results across them. The stuff about the review guide is also a bit odd, because he acts like the information isn't necessary since Intel already solved it in PresentMon, but the whole point of a review guide is to inform reviewers. Steve himself shows that he had to personally educate JayzTwoCents, so clearly the reviewing community had not gotten the memo, which means the information in the review guide is entirely necessary. Just because GN knows doesn't mean everyone else does.

Would it be better if Nvidia participated in PresentMon? Absolutely. Honestly, I think it would help if PresentMon weren't Intel-branded. I don't think companies should care about that stuff, but they do.
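For what it's worth, old PresentMon captures are easy enough to crunch yourself; a rough sketch, assuming the 1.x CSV column name (newer versions reworked the schema, so adjust if needed):

```python
# Minimal sketch: average frame time / fps from a PresentMon capture CSV.
# "MsBetweenPresents" is the column name from the 1.x CSV output as I recall it;
# later versions changed the schema, so rename the column for newer builds.
import csv

def summarize(path):
    frametimes = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            frametimes.append(float(row["MsBetweenPresents"]))
    avg_ms = sum(frametimes) / len(frametimes)
    print(f"{len(frametimes)} frames, avg {avg_ms:.2f} ms ({1000 / avg_ms:.1f} fps)")

# summarize("capture.csv")  # "capture.csv" is a hypothetical file name
```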
 
I was just playing with DLSS in Witcher 3. I set the app to override with the latest preset. Now, I'm not 100% sure it was doing what it's supposed to do (kind of hard to tell), so take this for what it is.

At 1440p I think DLSS Performance is the sweet spot for me. It's a bit hard to spot the quality difference between Performance and Quality. I can tell that DLAA is superior, but the perf hit going from DLSS Performance to DLAA is massive, not even close to a worthy trade.

The main limitations I noticed even with the new DLSS model are disocclusion artifacts around Geralt's head and a bit of ghosting in the grass. These were noticeable on every DLSS preset. So it's not perfect, but with DLSS Performance the game runs very well completely maxed out on the 4070.

BTW I tested FG (still the old model, I think) and couldn't see any artifacts from it; all the artifacts I noticed were still present when FG was turned off. But the difference between 50 fps and 90 fps is very noticeable...
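Quick sanity check on why 2x on a ~50 fps base lands around 90 rather than 100; a sketch under the simplest possible assumption (mine, not Nvidia's) that FG just shaves a bit off the base render rate:

```python
# If 2x FG takes ~50 fps to ~90 fps, then with FG enabled only half the output
# frames are rendered: ~45 rendered + ~45 generated per second. That implies
# FG itself cost roughly 10% of the base render rate here. Simplistic model;
# the real overhead varies by game and GPU.
fps_without_fg = 50
fps_with_fg = 90
rendered_with_fg = fps_with_fg / 2
overhead = 1 - rendered_with_fg / fps_without_fg
print(f"~{rendered_with_fg:.0f} rendered + {rendered_with_fg:.0f} generated per second, "
      f"FG overhead ~{overhead:.0%} of the base render rate")
```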
 
I'm at a poor spot in Alan Wake 2 to do a lot of testing, but out of curiosity I decided to take a look at something. This is DLSS Ultra Performance at 1440p, which is 480p internal. The settings are kind of all over the place, with no ray tracing. I went out in the rain, where it's dark and there are some bright, contrasting lights. There are some artifacts, but honestly you could easily play this way. Setting post processing to high, so that it's done after the upscale, got rid of some problems.

[Screenshot: Alan Wake 2, DLSS Ultra Performance at 1440p]
 
Frankly I would not believe this if I hadn't already messed around with it myself. It seems too good to be true.

I'd like to think the Switch 2 will benefit massively from this. But knowing Nintendo it wouldn't surprise me if they never bother implementing the new model.
 
Yeah, agreed, but how it looks in motion is also very important.

There is artifacting for sure. I don't know how all the chapter select stuff works in this game and don't want to mess up my save. The forest areas could be a total mess or something like that. I'm more impressed that at least in some areas it seems to work well.
 
Foliage does seem to be more difficult. The most notable artifacting I saw was disocclusion weirdness when grass emerges from behind Geralt: it leaves a dark trail in the grass around him that clears up after a few frames. The "fizzle" around Geralt's head is also immediately visible. But subjectively these artifacts didn't seem any worse on Performance than they did on Quality, so I think I'm a Performance man now :). If I can live with these artifacts, the game goes from completely unplayable at native TAAU to completely smooth at DLSS Performance.
 
I'm definitely going to use Performance in a lot of games to keep my fps high. It's one way to stretch out the life of this GPU.

Honestly, I may play around in Ultra Performance with some games that support ray tracing.

Edit: I'm going to try to get some time in on Alan Wake and see if I can get to a more interesting area.
 