Digital Foundry Article Technical Discussion [2025]

Avowed on PC features strong visuals, with a very powerful Hardware Lumen implementation for global illumination and reflections, and decent performance.

Great job DF! I will check this out on Game Pass. This was also an excellent example of the Transformer upscaling model helping with image quality.
 
Too little of a performance lift, too expensive for a 5070. (In the Pascal days, a GTX 1060 6GB cost $200 or less.)


The 5060 remains my only hope, but I guess 16GB of VRAM isn't in the cards for that one, so yeah, maybe next generation.
 
Too little of a performance lift, too expensive for a 5070. (In the Pascal days, a GTX 1060 6GB cost $200 or less.)


The 5060 remains my only hope, but I guess 16GB of VRAM isn't in the cards for that one, so yeah, maybe next generation.
Are you currently using an A770?

I sucked it up a couple years ago and got a $600 4070. TBH it's a mostly good 1440p experience. And while it hurt to pay $600 for something that I'd consider more like a 4060 than a 4070, $600 for this level of performance would be a bargain in the current market :(
 
It’s great that KCD2 runs smoothly on moderate hardware, but selfishly I wish they had aimed higher on the graphics side. The forested areas look fantastic, but the NPCs and man-made structures are poor in comparison. It won’t age well.
 
Started Avowed. What’s with this weird blur on everything? Up close, textures and everything are super sharp. The moment you're a foot away, it's like it's dropping to a different LoD, but very quickly. It makes everything have this weird fuzzy look. Everything is maxed on a 4090, FWIW.

Just came to this right after finishing Hogwarts, so it's pretty jarring.
 
Started Avowed. What’s with this weird blur on everything? Up close, textures and everything are super sharp. The moment you're a foot away, it's like it's dropping to a different LoD, but very quickly. It makes everything have this weird fuzzy look. Everything is maxed on a 4090, FWIW.

Just came to this right after finishing Hogwarts, so it's pretty jarring.
Can you post some screenshots? Do it in the Avowed thread or something.
 
Open the Avowed config file and the config file for Hogwarts (or another Unreal Engine game) to compare, and see if there is anything you can change to get rid of the fuzziness.
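
If you want something concrete to try, here's a minimal Engine.ini sketch of the kind of overrides people usually experiment with for this sort of distance blur. The cvars are standard Unreal ones, but the values (and whether Avowed respects them) are assumptions on my part, so treat it as a starting point rather than known-good settings:

Code:
[SystemSettings]
; Render at native resolution instead of an upscaled one
r.ScreenPercentage=100
; Mild post-sharpen to counter TAA/upscaler softness
r.Tonemapper.Sharpen=0.5
; Push LoD transitions further from the camera
r.ViewDistanceScale=1.5
; Bigger texture streaming pool (in MB) can reduce mip pop-in
r.Streaming.PoolSize=4096

The per-game Engine.ini normally lives somewhere under %LOCALAPPDATA%\<GameName>\Saved\Config\, but the exact folder name varies per game.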
 
Started Avowed. What’s with this weird blur on everything? Up close, textures and everything are super sharp. The moment you're a foot away, it's like it's dropping to a different LoD, but very quickly. It makes everything have this weird fuzzy look. Everything is maxed on a 4090, FWIW.

Just came to this right after finishing Hogwarts, so it's pretty jarring.
Check your upscaling algorithm and update your drivers.

I don't experience that issue.
 
Are you currently using an A770?

I sucked it up a couple years ago and got a $600 4070. TBH it's a mostly good 1440p experience. And while it hurt to pay $600 for something that I'd consider more like a 4060 than a 4070, $600 for this level of performance would be a bargain in the current market :(
Yes, I am, and it's working just fine. I might switch to Celestial or Druid, or even a B770 or an RTX 5060, but I'm not sure. With FG I'm playing everything at 360fps. My CPU (a 3700X) is more of a concern than the GPU.

On a different note: new Indiana Jones patch, with full path tracing for AMD and Intel GPUs, XeSS, 😁 DLSS 4, and more RT improvements even when using path tracing.


Ray-Traced Local Lights
For players who have Path Tracing enabled, this update includes a new option for ray-traced shadows to include all light sources, which significantly increases the quality of shadows in interior locations. (Players can also choose to keep the previous setting, which provides ray-traced shadows from the sun only.)
 
Are you currently using an A770?

I sucked it up a couple years ago and got a $600 4070. TBH it's a mostly good 1440p experience. And while it hurt to pay $600 for something that I'd consider more like a 4060 than a 4070, $600 for this level of performance would be a bargain in the current market :(
The RTX 40XX series was a good generation, all things considered.

Do you plan to switch anytime soon, to an RX 9070 XT, B770, RTX 5060 Ti, etc., if that's not asking too much? I think I can still get two decent years out of my A770 now that FG is a thing, but who knows.
 
The RTX 40XX series was a good generation, all things considered.

Do you plan to switch anytime soon, to an RX 9070 XT, B770, RTX 5060 Ti, etc., if that's not asking too much? I think I can still get two decent years out of my A770 now that FG is a thing, but who knows.
Yes, in hindsight the 4000 series doesn't look that bad, especially the Super refreshes.

I've no plans to upgrade this generation. Even if MSRP were real, the only card that interests me is the 5070Ti and I wouldn't drop $750 for such an incremental upgrade over my 4070. The 4070 is pretty slick and operates on a very reasonable power budget. Would've made a very compelling $500 card, and when thinking about it in those terms, $100 more than that isn't much to stress about across the multiple years I plan to use it.
 
It’s great that KCD2 runs smoothly on moderate hardware, but selfishly I wish they had aimed higher on the graphics side. The forested areas look fantastic, but the NPCs and man-made structures are poor in comparison. It won’t age well.

I'm pretty much in the same camp. The forested areas look fantastic as long as there's a lot of grass to hide the ground textures and other ground detail. I think the main issue with the game is materials looking very flat. Wooden boards, cut stone, and the dirt roads all look pretty bad by today's standards. The dense forest areas look really good, though.
 

00:00:00 Introduction
00:01:15 Polished and "stutter-free"
00:07:26 Great Lighting and Environments
00:11:16 Nitpicks and why "Ultra" is just a name
00:19:27 Optimisation Tips
00:22:36 Conclusion
Great video; games that run well need to be celebrated. But respect to DF for also showing the negative aspects, something not all other channels do. I really like that they call out the missing HDR support. HDR can make a big difference if implemented well, and I wish more games were made with HDR in mind.
 

0:00:00 Introduction
0:01:04 News 1: GTA 5 slated for PC upgrade
0:15:32 News 2: Microsoft announces game generating AI
0:32:49 News 3: Cyberpunk 2 job listing suggests ultra realistic crowds
0:44:53 News 4: 5070 Ti launches to price hikes and scalping
1:06:35 News 5: Nvidia deprecates 32-bit PhysX on 50 series GPUs
1:19:16 Supporter Q1: Couldn’t frame extrapolation solve input lag for PC games?
1:24:20 Supporter Q2: Why are so many people underwhelmed with current-gen graphics?
1:36:32 Supporter Q3: If Microsoft is giving up on console market share, where does Sony’s primary competition come from in the future?
1:42:45 Supporter Q4: With FSR 4 delivering good results, why should Sony continue to develop PSSR?
1:47:53 Supporter Q5: How can we prevent digital licensing issues when PSN is offline?

Alex goes off on the PhysX 32-bit boondoggle:

@dictator said:
The reason why I find this a big problem on, like, a theoretical level, or just an idealistic one, is that PC is the platform for backwards compatibility. And losing support for just an arbitrary reason, presumably for bottom-dollar reasons at the end of the day, I think is a massive shame and a disservice to the platform, and a disservice to the customers who buy new GPUs, because one of the biggest joys of ever getting a new GPU, or a new set of hardware, is loading up your favourite old games and seeing them fly at ridiculous resolutions and ridiculous framerates. That is one of the joys of getting a new GPU. And Nvidia is just really taking a dump on that legacy, taking a dump over the legacy of what PhysX was back in the day.

@dictator said:
I just think this is a huge misstep from Nvidia, and I'm going to bang this drum as loud as I can to either get this decision changed or, failing that, reach my nth-degree goal, because it shouldn't just be future cards: these games are gonna look worse on every card sold from now on, eventually there won't be any RTX 4000-series GPUs any more, and you're gonna have a game that either looks worse or runs way worse than it theoretically could for... really no good reason. I think the best way around this would be to actually open source the resources that make it possible to run this on any GPU at this point in time. If it is so unprofitable and so unimportant that Nvidia can just deprecate it, why not do something good with that deprecation and give it to the community?
 
Open sourcing is not just a matter of transitioning into community-driven preservation of the technology; it's about consent!

You may not have full ownership of, or a license to distribute, contributions from third-party authors without their authorization. Open sourcing doesn't just involve making your own code public; it may also mean making third-party code visible to the public, or their application-specific interfaces/integrations that are external to the library itself...
 
I think that at some point PSSR and FSR 4 will become almost the same thing, but Sony will still be calling it PSSR. It will just be Sony's implementation for PlayStation platforms. And that's not a bad thing; the only way that AMD and Sony can remain competitive is by collaborating.
 
I think that at some point PSSR and FSR 4 will become almost the same thing, but Sony will still be calling it PSSR. It will just be Sony's implementation for PlayStation platforms. And that's not a bad thing; the only way that AMD and Sony can remain competitive is by collaborating.
No?

The current PSSR implementation relies on an entirely custom set of 44 3x3 convolution shader instructions, and if Sony wants to impose a full BC requirement for the next generation, there's potentially less room for evolution compared to the more opaque black-box design FSR 4 is shaping up to be.
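
To make "a set of 3x3 convolutions" a bit more concrete, here's a tiny PyTorch sketch of an upscaler built purely from 3x3 convs plus a sub-pixel shuffle. To be clear, the depth, channel widths, and inputs are invented for illustration; nothing here is Sony's actual PSSR topology:

Code:
import torch
import torch.nn as nn

class TinyConvUpscaler(nn.Module):
    """Illustrative 2x upscaler made only of 3x3 convolutions.
    Layer count and widths are placeholders, not PSSR's real design."""
    def __init__(self, in_ch=3, width=32, depth=6):
        super().__init__()
        layers = [nn.Conv2d(in_ch, width, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True)]
        # Final 3x3 conv emits 4x the output channels; PixelShuffle rearranges
        # them into a 2x-larger image (the standard sub-pixel upscaling trick).
        layers += [nn.Conv2d(width, in_ch * 4, 3, padding=1), nn.PixelShuffle(2)]
        self.net = nn.Sequential(*layers)

    def forward(self, low_res):
        return self.net(low_res)

# 1080p frame in, ~4K frame out
model = TinyConvUpscaler()
out = model(torch.randn(1, 3, 1080, 1920))
print(out.shape)  # torch.Size([1, 3, 2160, 3840])

The point being: a network expressed like this is easy to retarget to any GPU with ML support, whereas something fused into custom shader instructions, as described above, ties the implementation to that hardware.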
 