Digital Foundry Article Technical Discussion [2025]

Frame gen is a great technology of "secondary" importance.

My problem with it is the higher input lag compared to no frame gen. Yes, the difference isn't large, but it's there: you get to play at 240Hz instead of 60Hz, yet technically the responsiveness will be better at native 60. And beyond that, there's a level of "disconnect" in playing at such high framerates when the expected responsiveness just isn't there.
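A back-of-the-envelope sketch of where that lag comes from, in Python. The ~1 extra base frame of buffering is an assumption for illustration (interpolation has to hold back the newest rendered frame), not a measured figure for any real implementation:

```python
# Back-of-the-envelope input-latency comparison: native 60 fps vs. a
# 60 fps base with frame generation presented at 240 Hz. All numbers
# are illustrative assumptions, not measurements of any real system.

def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds at a given frame rate."""
    return 1000.0 / fps

base = frame_time_ms(60)       # ~16.7 ms per genuinely rendered frame

# Interpolation has to hold back the newest rendered frame so it can
# generate frames between it and the previous one, so assume roughly
# one extra base frame of delay on top of normal render latency.
fg_extra_delay = base
fg_latency = base + fg_extra_delay

print(f"native 60 fps:      ~{base:.1f} ms render latency")
print(f"60->240 frame gen:  ~{fg_latency:.1f} ms (smoother motion, slower feel)")
```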

I also question the need for such high refresh rates. I have been passionate about display technology and hardware for a while, but even then I sometimes struggle to see the difference between 60 and 120Hz. I know people who struggle to even notice the difference between 30 and 60. This is a niche technology that's valuable to very few people. Still, it's good that it's getting developed.
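Part of the reason may be simple arithmetic: each doubling of refresh rate saves half as many milliseconds of frame time as the previous one. A quick illustration in Python:

```python
# Frame-time deltas between common refresh rates. The absolute gain
# shrinks at each doubling, which is one plausible reason the 60->120 Hz
# jump is harder to perceive than 30->60.

rates = [30, 60, 120, 240]
times = {hz: 1000.0 / hz for hz in rates}

for lo, hi in zip(rates, rates[1:]):
    delta = times[lo] - times[hi]
    print(f"{lo:>3} Hz -> {hi:>3} Hz: frame time {times[lo]:.1f} -> "
          f"{times[hi]:.1f} ms (saves {delta:.1f} ms)")
# 30->60 saves ~16.7 ms, 60->120 only ~8.3 ms, 120->240 just ~4.2 ms.
```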
 
Well, no reconstruction tech will ever have "zero issues", since one of the key metrics determining the final output is input resolution. DLSS Performance mode with an output resolution of 1080p is something wholly different from DLSS Performance mode at an output of 4K; it all depends on your criteria.
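To make that concrete, here's a small Python sketch using DLSS's published per-axis scale factors (Performance mode renders at 50% of output resolution per axis):

```python
# Internal render resolution for common upscaler quality modes, as a
# fraction of output resolution per axis. Ratios match DLSS's
# documented per-axis scale factors (Performance = 0.5, etc.).

MODES = {
    "Quality":           0.667,
    "Balanced":          0.580,
    "Performance":       0.500,
    "Ultra Performance": 0.333,
}

def input_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a given output size and mode scale."""
    return round(out_w * scale), round(out_h * scale)

for out_w, out_h in [(1920, 1080), (3840, 2160)]:
    for mode, s in MODES.items():
        w, h = input_res(out_w, out_h, s)
        print(f"{out_w}x{out_h} {mode:<17}: renders at {w}x{h}")
# Performance at 4K feeds the model 1920x1080; at 1080p output it only
# gets 960x540 -- a quarter of the pixels, hence the very different results.
```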

People can appreciate a feature when it delivers a result that is more cost effective than another method, not because it's "flawless". I myself critiqued DLSS in certain games when I first got my 3060; there were definite artifacts that I felt weren't being highlighted. But in general it's gotten quite a bit better, not just on Nvidia's side but from developers as well, who are better at looking out for those post-process bugaboos that were more common in earlier games, especially ones patched to add DLSS later, where it fucks up motion blur/DOF.
Well, sure, it might not have zero issues, but when we approach a time when the issues and artifacts aren't easily noticeable, then the glazing can begin. Until then, it's premature and unwarranted. It's fine to appreciate the benefits of DLSS, which unfortunately is not what's happening. This hyperbolic praise started when DF claimed that DLSS was better than native and people started parroting it ad nauseam. It wasn't universally true then and it's still not true today. Clearly, there are things DLSS does well, but it doesn't do everything well, and it does many things worse. It's about time to rein in the hyperbole and revise that narrative with several addendums.
Oliver does not strike me as someone who often lets his 'emotions get the better of him' in general, and besides, he's primarily a Mac and console guy - why would he want to 'glaze' DLSS? He specifically highlights sections where he feels PSSR outperforms DLSS, and you can see how the two can differ greatly from scene to scene. His final conclusion that DLSS is still superior in this title may not entirely match up with the clips provided, but considering that he talked about the issues with both PSSR and DLSS in detail, and given his history, I see no reason to imply he's being deliberately dishonest. For what purpose?
To quote George from Seinfeld, "It's not a lie if you believe it". I don't think that there's any malicious reason behind his comments. Oliver seems like a stand-up dude and I for the most part enjoy his content. I just think that repeating the "DLSS is superior" narrative ad nauseam causes people to arrive at that conclusion even when the data doesn't support it. Also, people weigh issues differently: for some, ghosting is worse than aliasing, while for others the reverse is true.
Nothing can really resolve the issues of non-temporal AA solutions without excessive brute force. TAA was introduced to deal with all the shader aliasing that modern games would otherwise have. So yes, of course DLSS wouldn't exist without TAA, and TAA wouldn't exist if there were actually a performant way to deal with shader aliasing.
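For anyone unfamiliar with how TAA works under the hood, here's a minimal Python sketch of the core idea: blend each new jittered frame into an exponentially weighted history. Real implementations also reproject the history with motion vectors and clamp it to limit ghosting; this toy version omits both:

```python
import numpy as np

# Minimal sketch of the TAA resolve: each frame is blended into an
# exponentially weighted history buffer, so aliasing that flickers
# from frame to frame gets averaged out over time.

ALPHA = 0.1  # weight of the current frame; the history keeps the rest

def taa_resolve(history: np.ndarray, current: np.ndarray) -> np.ndarray:
    """Blend the current jittered frame into the accumulated history."""
    return (1.0 - ALPHA) * history + ALPHA * current

# Toy usage: a flickering (aliased) pixel converges to its average.
history = np.zeros(3)
for frame in range(60):
    current = np.array([1.0, 0.0, 0.0]) if frame % 2 else np.zeros(3)
    history = taa_resolve(history, current)
print(history)  # settles around [0.5, 0, 0], the temporally stable value
```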
And our frames would be free of ghosting and blur. Our textures would be sharp without the need for "artificial sharpening", and 1080p would return to the crispness of old. What a wonderful world that would be...
 

We had it, and with a number of PC releases like most Nixxes ports, we still do. Like I said before, there wasn't a global TAA cabal that blackmailed developers into nearly universal adoption; they adopted it because it was the most performant solution to the growing problem of shader aliasing from more advanced materials (along with the performance savings of using temporal upscaling techniques for effects the games rely on).

FXAA/SMAA certainly don't cost more than TAA, so developers aren't choosing "blurry" over "sharp" to save performance. They're trading potentially less texture detail for a massive reduction in specular/shader aliasing across the entire scene, which is not possible without TAA/reconstruction/downsampling.
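The reason only temporal techniques can do this: the camera is offset by a sub-pixel jitter each frame, so the accumulator sees genuinely new sample positions inside every pixel over time, effectively supersampling spread across frames. Post-process filters like FXAA/SMAA only ever see the single aliased sample per pixel. A common choice for the jitter pattern is a Halton sequence; a minimal Python sketch:

```python
# Low-discrepancy Halton sequence, commonly used to generate the
# per-frame sub-pixel camera jitter that temporal AA accumulates.

def halton(index: int, base: int) -> float:
    """Halton value in [0, 1) for the given index and base."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

# First 8 jitter offsets in pixel units, centered on the pixel.
for i in range(1, 9):
    jx = halton(i, 2) - 0.5
    jy = halton(i, 3) - 0.5
    print(f"frame {i}: jitter ({jx:+.3f}, {jy:+.3f})")
```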

Here's an example of those 1080p "crisp" halcyon days. Uh, no thanks.
 
The DLSS glazing is wild since both upscalers look bad in this game. While one might be better in certain aspects than the other, the end result is that it still looks bad. The problem with all ML upscalers is that, for the most part, they all struggle when fed low input resolutions. This has been true regardless of DLSS, PSSR, or XeSS version in my experience. Even with DLSS4 it's artifact city based on their own videos, and it seems rather easy to break. I had to stop keeping track of the artifacts I noticed, and it wasn't just their videos either; DLSS4 videos from other content producers show similar artifacts.

Alex questioning the purpose of PSSR? It's actually very logical. Having your own technology means you can remain vendor agnostic. If they decided to switch to another vendor for a new console, their games wouldn't have a software dependency on a proprietary upscaler they'd need to license. I mean, it's pretty obvious, so I'm surprised the question was even asked at all. That, combined with the fact that AMD still has nothing; they're still trying to beta test FSR4 as we speak...
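A hypothetical Python sketch of that architectural point (every class and method name here is illustrative, not any real SDK): engine code targets one interface, and the platform holder can swap the backend without the games ever depending on a licensed third-party library:

```python
from abc import ABC, abstractmethod

# Illustrative vendor-agnostic upscaler abstraction. None of these
# names correspond to a real API; they only sketch the dependency
# structure a first-party upscaler buys you.

class Upscaler(ABC):
    @abstractmethod
    def upscale(self, frame: bytes, out_w: int, out_h: int) -> bytes: ...

class FirstPartyUpscaler(Upscaler):
    def upscale(self, frame: bytes, out_w: int, out_h: int) -> bytes:
        # Platform holder's own ML upscaler, shipped with the SDK.
        return frame  # placeholder for the actual upscaling work

class VendorUpscaler(Upscaler):
    def upscale(self, frame: bytes, out_w: int, out_h: int) -> bytes:
        # Proprietary vendor path that would need licensing and ties
        # the software stack to one GPU vendor.
        return frame  # placeholder for the actual upscaling work

def render_frame(upscaler: Upscaler, frame: bytes) -> bytes:
    # Game/engine code only ever sees the abstract interface, so the
    # backend can change between console generations without patches.
    return upscaler.upscale(frame, 3840, 2160)
```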

Finally, the comment about not having to discuss DLSS issues on PC is head-scratching, since DLSS in its current iteration on PC is artifact-ridden when fed low input resolutions. Even in their Nvidia preview video, with all the DLSS4 glazing, they pointed out noticeable flaws in DLSS3's CNN model that were fixed by the new transformer model. So to make that comment, quite head-scratching indeed...
Based on the footage in that video, I would say DLSS looks a lot better. It is soft, but the PSSR footage is a noisy mess.
 
Hair is of the essence to Nvidia; now we understand why.

 