Digital Foundry Article Technical Discussion [2025]

Was there any uplift on consoles between the beta and release? If so, maybe the PC version will also benefit. I know I saw people on all platforms during the beta talking about how poorly it ran and looked.
The benchmark should use the final version of the game I assume.
 
The benchmark should use the final version of the game I assume.
Can I be cheeky and say not even the version on release day is the final version ;)

Joking aside, logically that would make sense; only a couple of days to see just how screwed up the PC version is.
 
You need 16 teraflops to reach a dynamic (?) 1080p at 60fps without upscaling. What even is this engine right now? REX Engine better be a huge overhaul.
 
Japanese devs are technologically further behind Western devs than FSR is behind DLSS. It's just taboo to call it out.
Monster Hunter Wilds is the worst showing yet IMO. I've played mostly Japanese games lately, and the tech ranges from passably mediocre to horrible. MHW is beyond horrible.

I wait on some patches for games like this. By the time I got around to Dragon's Dogma 2, it had entered "passably mediocre" territory. Basically my 4070/13600K was enough to brute force through the remaining issues or cover them up with framegen.
 
No RT there, no continuous LOD, poor indirect lighting, and all of it in an arid, empty desert (one of the easiest biomes to render in real time). I really have no idea what is wrong.
This engine simply doesn't seem fit for huge environments. It was the same story for DD2.

And yeah, as much as I love games from Japanese devs, they're often technically very underwhelming. Elden Ring, Monster Hunter Wilds, Dragon's Dogma 2, and many more.
 
Japanese devs are still OK at game design, but I agree they are rather poor when it comes to the execution of technical aspects, from graphics tech and optimization to things like netcode and gameplay reliability. They are always at the bottom of the industry there.
 
No RT there, no continuous LOD, poor indirect lighting, and all of it in an arid, empty desert (one of the easiest biomes to render in real time). I really have no idea what is wrong.
The RE Engine seems to have been designed around corridors and closed areas; it hits heavy CPU limitations in open-world sections and with many actors on screen. It also has botched screen-space reflections, and even ray-traced reflections are weak (unless path tracing is forced).
 


Oh yay, saw this posted elsewhere... another great, optimized PC game to look forward to! As if Frame Generation being the first thing the game asks you to enable wasn't enough of a hint at the incoming shit being shoveled our way.
 
The RE Engine seems to have been designed around corridors and closed areas; it hits heavy CPU limitations in open-world sections and with many actors on screen. It also has botched screen-space reflections, and even ray-traced reflections are weak (unless path tracing is forced).
But the game isn't really CPU limited on the consoles. It's the GPU side that makes no sense.

The only thing I can think of is that they are always rendering the monsters and the fauna even when you aren't looking at them, combined with a geometry culling system that isn't working the way it should.
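To illustrate what "culling not working how it should" would mean in practice, here's a minimal bounding-sphere vs. frustum test. This is purely illustrative, with toy plane values and names; it's not anything from the actual engine. If a check like this is skipped, or the bounds fed into it are way too large, off-screen monsters and fauna still get submitted to the GPU every frame.

```cpp
#include <cstdio>

// Minimal bounding-sphere vs. frustum test. A real engine would rebuild the
// six planes from the camera's view-projection matrix each frame; these are
// just placeholder values for a box-shaped "frustum".
struct Vec3   { float x, y, z; };
struct Plane  { Vec3 n; float d; };   // plane: dot(n, p) + d = 0, n points inward
struct Sphere { Vec3 c; float r; };

static float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Returns false when the sphere is entirely outside any plane,
// i.e. the object can safely be skipped for this view.
static bool isVisible(const Sphere& s, const Plane frustum[6]) {
    for (int i = 0; i < 6; ++i) {
        if (dot(frustum[i].n, s.c) + frustum[i].d < -s.r)
            return false;             // fully outside this plane -> cull
    }
    return true;                      // inside or intersecting all planes -> draw
}

int main() {
    Plane frustum[6] = {
        {{ 1, 0, 0}, 0}, {{-1, 0, 0}, 100},
        {{ 0, 1, 0}, 0}, {{ 0,-1, 0}, 100},
        {{ 0, 0, 1}, 0}, {{ 0, 0,-1}, 100},
    };
    Sphere behindCamera{{-50, 50, 50}, 5};   // should be culled
    Sphere inView      {{ 50, 50, 50}, 5};   // should be drawn
    std::printf("behindCamera visible: %d\n", isVisible(behindCamera, frustum));
    std::printf("inView visible: %d\n", isVisible(inView, frustum));
}
```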
 
CPU version, which is junk. GPU PhysX is proprietary and locked up.


In the documentation it says that you can use CUDA to accelerate physics simulation.


But being open source doesn't make a difference, because the removal is at the hardware level: Blackwell's streaming multiprocessors no longer support 32-bit CUDA instructions. That's also why other applications, such as PassMark, lost performance.


And this deprecation is not new or silent, as some reports claim. It has been happening in waves since 2017 (2013 on Linux): with each new CUDA release, some 32-bit feature was removed, and whenever you tried to compile something as 32-bit CUDA you got a deprecation warning.

Support for developing and running 32-bit CUDA and OpenCL applications on x86 Linux platforms is deprecated.
32-bit application developer tools for Windows is deprecated and will no longer be supported in the next release of CUDA. As of CUDA 9.2, 32-bit support for the following tools on Windows (and associated files) is still available but will be removed in the next release:

Ideally, developers would recompile the PhysX in these games against the 64-bit version, but that's not going to happen. It also doesn't make sense for Nvidia to keep maintaining compatibility with 32-bit applications, since they've been out of mainstream use for at least 10 years: there's no longer a 32-bit Windows or (outside a few rare distros) a 32-bit Linux. And in 2038 these 32-bit applications will run into other problems anyway.
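For reference, this is roughly what the modern 64-bit path looks like: GPU rigid bodies in the current PhysX SDK are enabled through a CUDA context manager plus a couple of scene flags. This is only a sketch based on the public PhysX 4/5 API (library linking, GPU requirements, and error handling are all omitted), and of course none of it helps the old 32-bit GPU PhysX binaries that shipped with these games.

```cpp
#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics*    physics    = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // The CUDA context manager is what routes the simulation to the GPU.
    PxCudaContextManagerDesc cudaDesc;
    PxCudaContextManager* cuda = PxCreateCudaContextManager(*foundation, cudaDesc);

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;

    if (cuda && cuda->contextIsValid()) {
        sceneDesc.cudaContextManager = cuda;
        sceneDesc.flags            |= PxSceneFlag::eENABLE_GPU_DYNAMICS; // rigid-body solve on GPU
        sceneDesc.broadPhaseType    = PxBroadPhaseType::eGPU;            // GPU broad phase
    } // else: the scene quietly falls back to the CPU path

    PxScene* scene = physics->createScene(sceneDesc);
    scene->simulate(1.0f / 60.0f);   // one 60 Hz step
    scene->fetchResults(true);
    return 0;
}
```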
 
On the broader discussion of this issue: someone posted a thread about what happens to "raster" gaming in a future of GPUs designed around path tracing, and together with this it raises an interesting question about the long-term backwards compatibility of PC gaming and what the expectations should be. With the slowdown in hardware improvements, there is likely to come a point where we can't work around these breaks with software solutions (such as emulation) either, simply due to the lack of performance headroom.

Or maybe Microsoft's idea of generative AI preserving backwards compatibility, with something that is an analogue of the user experience rather than an actual copy of the game, has some merit rooted in practicality in the future.
 
Mod: Hey all. Please wait until videos go public before posting. No rules are breached here, but just cleaning it up for their sake.

Thanks,
 
I think there are enough sources out there now that Digital Foundry should look specifically at DirectStorage with GPU decompression and call out how sporadic the frametimes are when simply turning the camera. This is greatly exacerbated with kb/m controls. You can tank the framerate just by looking around quickly in games like Ratchet and Clank, Spider-Man 2, and a couple of others I know of but am forgetting offhand. It's terrible.

Nvidia specifically seems to be affected disproportionately compared to AMD, for whatever reason, but it's common enough now across these games that use GPU decompression that it doesn't seem like a developer-specific implementation issue.

I think devs should stick to CPU DirectStorage support until this is either solved or there's some new solution (neural decompression, I guess).
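For what it's worth, the DirectStorage runtime does expose a switch for exactly that: an application (or a test build) can opt out of GPU decompression and take the CPU fallback instead. A rough sketch, assuming the DSTORAGE_CONFIGURATION / DisableGpuDecompression fields from the DirectStorage 1.1+ headers, with error handling omitted:

```cpp
#include <windows.h>
#include <dstorage.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Must run before the first DStorageGetFactory() call; after that the
// configuration is locked in. GDeflate assets then get inflated on CPU
// worker threads instead of on the GPU.
ComPtr<IDStorageFactory> CreateCpuDecompressionFactory()
{
    DSTORAGE_CONFIGURATION config{};
    config.DisableGpuDecompression = TRUE;
    DStorageSetConfiguration(&config);

    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));
    return factory;
}
```

Whether the CPU path is viable performance-wise for a shipped game is another question, but it at least makes A/B testing the camera-turn stutter straightforward.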
 

1:24:20 Supporter Q2: Why are so many people underwhelmed with current-gen graphics?
I'm not sure I fully agree with John's argument about UC4 and static environments, especially as they're inserting clips of a jeep plowing through dozens of movable physics objects while he's explaining it. Dynamic environments and objects don't need RT; RT just allows movable objects to get closer to baked lighting quality.

There are games today and in the past that have quite a bit of interactivity without RT. Astro Bot and TOTK are two modern examples; TOTK even has a crude system for dynamic diffuse GI and reflections on very low-end hardware. Fortnite is maybe the best example of what he's saying, and its dynamism is likely why Epic moved in the direction of Lumen. But games like Fortnite are the exception.

Imo, the reason people are unimpressed is that real-time RT lighting brings with it heavy processing requirements, noise, and low resolutions on common hardware, all for marginal visual gains if you already had good baked lighting. Plus, many games have no real increase in interactivity (Silent Hill 2 Remake comes to mind). Basically, many games are filled with new artifacts that weren't there before, and most people couldn't even see the artifacts of the old rasterized approach in the first place, especially if they're comparing against the best rasterized examples from the past.

Of course RT is the future, and has been the future since programmable shading came out in 2000, but perhaps it's been pushed too hard when most people really don't have the raw GPU power to make it look good yet. Moore's Law slowing to a crawl just makes that worse.
 