Digital Foundry Article Technical Discussion [2025]

The DLSS glazing is wild, since both upscalers look bad in this game. While one might be better than the other in certain aspects, the end result still looks bad. The problem with all ML upscalers is that, for the most part, they struggle when fed low input resolutions. In my experience this is true regardless of DLSS, PSSR, or XeSS version. Even with DLSS4 it's artifact city, based on their own videos, and it seems rather easy to break. I had to stop keeping track of the artifacts I noticed, and it wasn't just their videos either; DLSS4 videos from other content producers show similar artifacts.

Alex questioning the purpose of PSSR? It's actually very logical. Having your own technology means you can remain vendor agnostic: if they decided to switch to another vendor for a new console, their games wouldn't have a software dependency on a proprietary upscaler they'd need to license. It's obvious enough that I'm surprised the question was even asked. That, combined with the fact that AMD still has nothing; they're still beta testing FSR4 as we speak...

Finally, the comment about not having to discuss DLSS issues on PC is head-scratching, since DLSS in its current iteration on PC is artifact-ridden when fed low input resolutions. Even in their Nvidia preview video, with all the DLSS4 glazing, they pointed out noticeable flaws in DLSS3 that were fixed by the new transformer model. So to make that comment is quite head-scratching indeed...
 
Bryan and Malcom are fantastic spokesmen for Nvidia; they very much remind me of listening to Carmack. Their enthusiasm for their work shines through, and it gives me the energy to keep watching even when I have little knowledge of thermodynamics or machine learning.

Must have been a cool moment for DF when Bryan, having watched Rich's video, knew exactly what Alex was talking about.
Thanks for the interview, Alex. I am incredibly curious about what you spoke about off camera, though ;)

Comments on his hair increase with each video; you know you have to ask him whether he's used AI to optimize his hair care regimen next time, right? ;)
 
The DLSS glazing is wild, since both upscalers look bad in this game. […]
Agree with some of what you're saying, but when you say these are 'bad', we're not really getting to see the alternative, which would be these games at a blurrier native res. Yes, upscalers of any type (including AI-based ones) fare worse the lower in resolution you go, but in almost all cases the end result is still preferable to native res. I know some people may say they're still fine with native 1080p and whatnot, but I don't think most people with next-gen-capable hardware really are anymore. Definitely not in these more presentation-heavy AAA games.
 
Bryan and Malcom are fantastic spokesmen for Nvidia; they very much remind me of listening to Carmack. […]
I was super impressed with Bryan in this interview. He gave answers that were both informative and diplomatic. Usually you can get one of those but not both.
 
Agree with some of what you're saying, but when you say these are 'bad', we're not really getting to see the alternative, which would be these games at a blurrier native res. […]

Great point. So many comparisons are made between 4K upscaled and 4K native while ignoring the performance and cost implications. A more relevant comparison is native 1440p vs 1440p upscaled to 4K, as that's the real decision facing an end user with a fixed GPU budget.
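To put some rough numbers on that trade-off, here's a quick sketch of the internal render resolutions behind each output. The per-axis scale factors below are the commonly cited values for the DLSS presets; treat them as approximations, since games can and do override them.

```python
# Approximate per-axis scale factors for common DLSS presets
# (commonly cited values; individual games may override these).
SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w, out_h, mode):
    """Internal render resolution for a given output res and preset."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALE:
    w, h = internal_res(3840, 2160, mode)
    print(f"4K {mode}: renders at {w}x{h} ({w * h / 1e6:.1f} MP)")
```

So "4K DLSS Quality" is really shading about 3.7 MP per frame, roughly the cost of native 1440p, which is exactly why native 1440p vs 1440p-to-4K is the fairer fight on a fixed GPU budget.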
 
I went from a 1080p monitor to 1440p recently. DLAA at 1080p looks worse than DLSS Quality at 1440p. Significantly worse, IMO. Even 1440p DLSS Balanced looks better.
 
Well-spoken guy and very chill to talk to; Nvidia is in great hands.

My favourite sentence of the video: "DLSS has something for everybody". And even more interesting: older games that already featured FG via DLSS can now run with DLSS4 MFG without changing the game at all!

I went from a 1080p monitor to 1440p recently. DLAA at 1080p looks worse than DLSS Quality at 1440p. […]
Heheheh, I just went the other way around and got a 360Hz 1080p monitor for motion clarity. I should have it at home by Wednesday; I might share my impressions.

But now with Nvidia MFG and apps like LS, there isn't going to be a distinction between high-Hz esports monitors and regular monitors. Single-player games can finally benefit a LOT in the motion clarity department from 240Hz, 360Hz, 500Hz, 750Hz, etc. displays.
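The motion clarity gain is easy to estimate with back-of-envelope persistence math. This sketch assumes an ideal sample-and-hold display showing one unique frame per refresh with no strobing/BFI, in which case perceived smear during eye tracking scales with how long each frame is held:

```python
# Back-of-envelope motion clarity on a sample-and-hold display.
# Assumes one unique frame per refresh and no strobing/BFI.
def persistence_ms(hz):
    """How long each frame is held on screen, in milliseconds."""
    return 1000.0 / hz

def blur_px(hz, speed_px_per_s):
    """Approximate pixels of smear for eye-tracked motion
    at the given on-screen speed, across one held frame."""
    return speed_px_per_s * persistence_ms(hz) / 1000.0

for hz in (60, 144, 240, 360, 500):
    print(f"{hz:>3} Hz: held {persistence_ms(hz):5.2f} ms, "
          f"~{blur_px(hz, 960):4.1f} px blur at 960 px/s")
```

The 960 px/s pan speed is just an illustrative number, but the trend is the point: 360Hz holds each frame roughly six times shorter than 60Hz, so tracked motion smears roughly six times less, provided the frame rate actually keeps up with the refresh rate.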

And they talk in the video about monitors with thousands of Hz. 😁 Crazy.

I like the guy's hair. Dunno if it's modelled with AI, but it's like L'Oréal: "because you're worth it". xD
 
The DLSS glazing is wild, since both upscalers look bad in this game. […]
I don't see how Oliver can choose DLSS over PSSR in this game when DLSS is that blurry. It looks awful.
 
That interview left me with the impression that NVIDIA's software is waiting for its hardware to catch up and has been for a while.
Well, maybe they don't care. If you think about it, Nvidia has just deleted the low-end market. Now with FG, the low-end market ceases to exist; any GPU can run games at decent framerates.

If you buy a mid-to-low-end Intel B570, you won't be missing out on cutting-edge graphics, thanks to Nvidia inventing FG some years ago and the way it keeps evolving.
 
I don't see how Oliver can choose DLSS over PSSR in this game when DLSS is that blurry. It looks awful.
I didn't want to outright say it, but his conclusion does not follow from the evidence provided. Like I said, the DLSS glazing is out of hand. All versions of DLSS I've seen have varying problems, even across different games. Even the new transformer model that Nvidia is hyping has issues. So until DLSS has no issues, for me the glazing is completely unwarranted. Frankly, I think people need to speak factually about DLSS instead of letting their emotions get the better of them.

DLSS is a fantastic technology for aiding and covering up the problems introduced by TAA. Nothing more, nothing less. Remove TAA and the problems of TAA are exchanged for different issues which DLSS cannot address.
 
Well, maybe they don't care. If you think about it, Nvidia has just deleted the low-end market. Now with FG, the low-end market ceases to exist; any GPU can run games at decent framerates.
No. This is precisely where people get tripped up in their optimism over the high-end demos and performance claims.
1) "Frame generation is great because it lets me take 60-100Hz up to the 240-480Hz or whatever my monitor can do" is an accurate statement.
2) "Frame gen is great because it can take my low-end GPU that runs at 20Hz and make it look and feel great at 60" is not correct; not even NVIDIA is making that claim.

Frame gen is still a high-end feature for bringing more smoothness to games that already run decently well. How much you value that additional smoothness will of course vary, but no one is arguing against the existence of benefits in those cases. Unlike in the past, though, this benefit does *not* apply to lower-end cases. Whatever card you get has to already be able to run the settings you care about decently fast, so it really does nothing for the low end (again, even assuming frame gen itself were free and perfect quality).

This is not to say that frame gen is bad, just that it does not scale down. This is even discussed in the interview: even ignoring feel, all of these techniques are fundamentally limited by the amount of variation in the input data. Once motion across the base samples gets too large, there's no way to properly reconstruct it. Just like with upscaling, having a reasonably good base frame rate will always be important.
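The "it doesn't scale down" point can be sketched numerically. This is a simplified model, not how any vendor specifies it: it ignores the extra buffering latency frame gen adds and only shows that new input is sampled once per *rendered* frame, so responsiveness tracks the base rate no matter how many frames are interpolated in between.

```python
# Simplified model: frame generation multiplies presented frames,
# but input is only sampled once per rendered (base) frame.
# Ignores the additional buffering latency FG itself introduces.
def presented_fps(base_fps, gen_factor):
    """Frames shown per second with an N-x frame gen factor."""
    return base_fps * gen_factor

def input_interval_ms(base_fps):
    """Milliseconds between consecutive input samples."""
    return 1000.0 / base_fps

for base in (20, 30, 60, 120):
    for factor in (2, 4):
        print(f"base {base:>3} fps x{factor} FG -> "
              f"{presented_fps(base, factor):>3.0f} fps presented, "
              f"input still every {input_interval_ms(base):.1f} ms")
```

A 20fps base taken to 80fps via 4x FG still responds to input every 50ms, which is why it looks smoother but doesn't feel like 80fps, whereas a 120fps base taken to 480fps already felt fine to begin with.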
 
I didn't want to outright say it, but his conclusion does not follow from the evidence provided. Like I said, the DLSS glazing is out of hand. All versions of DLSS I've seen have varying problems, even across different games. Even the new transformer model that Nvidia is hyping has issues. So until DLSS has no issues, for me the glazing is completely unwarranted.

Well, no reconstruction tech will ever have 'zero issues', as one of the key metrics determining the final output is input resolution. DLSS Performance mode with an output res of 1080p is something wholly different from DLSS Performance mode at an output of 4K; it all depends on your criteria.

People can appreciate a feature when it delivers a result that is more cost-effective than another method, not because it's 'flawless'. I myself critiqued DLSS in certain games when I first got my 3060; there were definite artifacts that I felt weren't being highlighted. But in general it's gotten quite a bit better, not just from Nvidia's side but from developers as well, who are better at looking out for those post-process bugaboos that were more common in earlier games, especially ones that were patched to add DLSS later and it fucks up with motion blur/DOF.

Frankly, I think people need to speak factually about DLSS instead of letting their emotions get the better of them.

Oliver does not strike me as someone who often lets his 'emotions get the better of him' in general, and besides, he's primarily a Mac and console guy; why would he want to 'glaze' DLSS? He specifically highlights sections where he feels PSSR outperforms DLSS, and you can see how the two can differ greatly from scene to scene. His final conclusion that DLSS is still superior in this title may not entirely match up with the clips provided, but considering that he talked about the issues with both PSSR and DLSS in detail, and given his history, I see no reason to imply he's being deliberately dishonest. For what purpose?

Remove TAA and the problems of TAA are exchanged for different issues which DLSS cannot address.

Nothing can really resolve the issues of non-temporal AA solutions without excessive brute force. TAA was introduced to deal with all the shader aliasing that modern games would otherwise have. So yes, of course DLSS wouldn't exist without TAA, and TAA wouldn't exist if there were actually a performant way to deal with shader aliasing.
 