Digital Foundry Article Technical Discussion [2025]

But there are different types of precomputed lighting that give varying degrees of interactivity, from lightmaps to irradiance volumes to PRT. There are also methods of real-time lighting that are not hardware RT, like SVOGI, real-time cubemaps, etc. These all have their own drawbacks, but it's not like updating a BVH with lots of dynamic objects is particularly viable right now either. The point is, there are lighter-weight options that can still look "good" and don't require a game to be noisy and run at 720p internal.
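To give a sense of how light those precomputed options are at runtime, here's a minimal sketch of sampling a baked irradiance volume (plain C++, the struct and layout are my own simplification, not any particular engine's): the per-sample cost is just a trilinear lookup into a grid of probes, no rays traced at runtime.

```cpp
// Minimal sketch of runtime sampling for a baked irradiance volume.
// Probe data is assumed to be precomputed offline; at runtime the cost
// is a trilinear fetch from a 3D grid -- no rays are traced.
#include <algorithm>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

struct IrradianceVolume {
    int nx, ny, nz;              // probe counts along each axis
    Vec3 origin;                 // world-space corner of the volume
    float spacing;               // distance between probes
    std::vector<Vec3> probes;    // baked irradiance per probe (x-major)

    Vec3 probe(int x, int y, int z) const {
        x = std::clamp(x, 0, nx - 1);
        y = std::clamp(y, 0, ny - 1);
        z = std::clamp(z, 0, nz - 1);
        return probes[(z * ny + y) * nx + x];
    }

    // Trilinear interpolation between the 8 probes surrounding `p`.
    Vec3 sample(Vec3 p) const {
        float fx = (p.x - origin.x) / spacing;
        float fy = (p.y - origin.y) / spacing;
        float fz = (p.z - origin.z) / spacing;
        int x0 = (int)fx, y0 = (int)fy, z0 = (int)fz;
        float tx = fx - x0, ty = fy - y0, tz = fz - z0;

        auto lerp = [](Vec3 a, Vec3 b, float t) {
            return Vec3{a.x + (b.x - a.x) * t,
                        a.y + (b.y - a.y) * t,
                        a.z + (b.z - a.z) * t};
        };
        Vec3 c00 = lerp(probe(x0, y0,     z0),     probe(x0 + 1, y0,     z0),     tx);
        Vec3 c10 = lerp(probe(x0, y0 + 1, z0),     probe(x0 + 1, y0 + 1, z0),     tx);
        Vec3 c01 = lerp(probe(x0, y0,     z0 + 1), probe(x0 + 1, y0,     z0 + 1), tx);
        Vec3 c11 = lerp(probe(x0, y0 + 1, z0 + 1), probe(x0 + 1, y0 + 1, z0 + 1), tx);
        return lerp(lerp(c00, c10, ty), lerp(c01, c11, ty), tz);
    }
};

int main() {
    // 2x2x2 toy volume: dark probes on one side, bright on the other.
    IrradianceVolume vol{2, 2, 2, {0, 0, 0}, 1.0f,
                         {{0.1f, 0.1f, 0.1f}, {1, 1, 1}, {0.1f, 0.1f, 0.1f}, {1, 1, 1},
                          {0.1f, 0.1f, 0.1f}, {1, 1, 1}, {0.1f, 0.1f, 0.1f}, {1, 1, 1}}};
    Vec3 gi = vol.sample({0.5f, 0.5f, 0.5f});
    std::printf("irradiance at center: %.2f %.2f %.2f\n", gi.x, gi.y, gi.z);
}
```

The interactivity limits come from the bake, not the lookup: dynamic objects can receive this lighting cheaply, they just can't change it.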

The reality is those techniques don’t scale and don’t produce convincing results; otherwise they would be more widespread. Why don’t we have tons of games with non-raytraced GI today? Instead we got screen-space AO, screen-space reflections and terrible aliased shadow maps cast by a handful of lights at best. RT didn’t cause the problem; it’s just the best current solution to problems that have existed for a long time.

As we see in KCD2, it is possible to ship games that run well if you don’t push the envelope on visuals. But it is a trade-off.
 
Ah, the Capcom DRaMa is back....

 
If you think some developers making use of said technology are the problem
No, the problem is the developer not optimizing his game; the textures look like ass even on a PS5.


If you are talking about DirectStorage, several games have already shipped with it and run fine. NVIDIA even has RTX-IO versions in titles like Portal RTX, Portal RTX Prelude and the upcoming Half Life 2 RTX, and they all run fine. However, already-broken games don't run fine, because they are broken.
 
I wish those YouTube titles said "don't buy this game on PC AND CONSOLES". Some of the textures he is showing in that video can only be explained by the absence of a texture streaming system in this engine.

Knowing what the engine is, how the hell do you even decide to just go ahead and make another mainline Monster Hunter with it?

It's madness. I'll get the game tomorrow, since I got it physical and there was a delay, and I guess I'll see the damage for myself.
 
Actually this game does have texture streaming, quite aggressive streaming even. The evidence: on my old PC, the NVMe SSD sometimes got disconnected while running this game, and the whole scene would then gradually turn black. You could even see newly streamed textures fail to load and turn into plain black triangles. If there were no texture streaming, that wouldn't happen at all.
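For what it's worth, that failure mode is exactly what you'd expect from on-demand streaming with no always-resident fallback mip. A toy sketch of the pattern (my own simplified C++, not RE Engine code):

```cpp
// Toy model of on-demand texture streaming. Higher mips are requested as
// needed; whatever is resident at draw time is what you see. If the I/O
// request fails (e.g. the SSD drops out) and nothing was ever resident,
// the renderer has only black to draw.
#include <cstdio>
#include <string>

struct StreamedTexture {
    std::string name;
    int residentMip = -1;   // -1 means no data resident yet -> renders black
};

// Stand-in for an async disk read; `diskOnline` simulates the SSD dropping out.
bool readMipFromDisk(const StreamedTexture& tex, int mip, bool diskOnline) {
    (void)tex; (void)mip;
    return diskOnline;
}

void streamMip(StreamedTexture& tex, int wantedMip, bool diskOnline) {
    if (wantedMip <= tex.residentMip) return;        // already resident
    if (readMipFromDisk(tex, wantedMip, diskOnline))
        tex.residentMip = wantedMip;                 // upload succeeded
    // On failure we keep whatever was resident before. With no baked-in
    // low-res fallback, a texture that never streamed anything stays black.
}

void draw(const StreamedTexture& tex) {
    if (tex.residentMip < 0)
        std::printf("%-12s -> BLACK (no mip resident)\n", tex.name.c_str());
    else
        std::printf("%-12s -> mip %d resident\n", tex.name.c_str(), tex.residentMip);
}

int main() {
    StreamedTexture wall{"wall_albedo"};
    StreamedTexture rock{"rock_albedo"};

    streamMip(wall, 2, /*diskOnline=*/true);    // streamed before the drop-out
    streamMip(rock, 2, /*diskOnline=*/false);   // requested after the SSD vanished

    draw(wall);   // wall_albedo  -> mip 2 resident
    draw(rock);   // rock_albedo  -> BLACK (no mip resident)
}
```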
 
If it does, then it's a really bad system. Sadly, that's probably even worse 😅
 
720p internal is not usually caused by ray tracing; 90% of the time it's rasterized games, like the recently released Monster Hunter Wilds. There is also A Plague Tale: Requiem, Gotham Knights, Star Wars Jedi Survivor and Forspoken (its performance mode doesn't support ray tracing yet still runs at 720p internal), or the usual UE5 suspects, like Immortals of Aveum, Avowed, Black Myth Wukong, etc.
There are definitely some unusually heavy rasterized games coming out, like AW2 and MH Wilds. It's a bit of a head-scratcher as to why these games are so GPU heavy. However, I don't think that's the majority. Most of your examples don't even support that. A Plague Tale is 1080p in perf mode on SX/PS5. Gotham Knights uses RT reflections and is close to 2160p (kind of an outlier, honestly). Jedi was ~720p when RT was on for the perf mode, but since they removed RT for that mode, it's ~1080p most of the time. Forspoken is also ~1080p in perf mode since the latest patch.

UE5 games are another beast. We know Nanite is not the most efficient rasterizer, but it can handle large amounts of small geometry better than the hardware rasterizer can. That doesn't mean rendering all that geo comes cheap, though. Lumen is RT, just not hardware RT. These techs (along with VSMs) substantially increase per-pixel processing costs, so I understand why UE5 games are often low res. UE5 in general is an example of what I'm talking about: games using its headline features often seem to overshoot what the consoles and mainstream PCs can realistically do at acceptable performance and IQ. I have hope that games using newer versions of the engine will be more performant (although not perfect, Avowed is a step in the right direction imo).
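Just as a back-of-envelope on why higher per-pixel cost drags internal resolution down (toy numbers of my own, not measurements from any game): if most of the frame scales with shaded pixel count, roughly doubling the per-pixel cost inside a fixed 16.6 ms budget roughly halves the pixels you can afford.

```cpp
// Toy model: frame_time = fixed_cost + per_pixel_cost * pixel_count.
// The per-pixel costs below are assumptions for illustration only.
#include <cmath>
#include <cstdio>

int main() {
    const double budgetMs  = 16.6;          // 60 fps target
    const double fixedMs   = 4.0;           // work that doesn't scale with resolution
    const double costsNs[] = {3.4, 6.8};    // cheap pipeline vs Lumen/VSM-style pipeline

    for (double perPixelNs : costsNs) {
        double pixels = (budgetMs - fixedMs) * 1e6 / perPixelNs;   // ms -> ns
        double height = std::sqrt(pixels / (16.0 / 9.0));          // 16:9 internal res
        std::printf("%.1f ns/px -> %.2f Mpix affordable (~%.0fp internal)\n",
                    perPixelNs, pixels / 1e6, height);
    }
}
```

With those made-up numbers it works out to roughly 1440p versus roughly 1020p internal for the same frame time, which is about the gap we keep seeing.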
Plenty of games of all scopes have shipped with non-RT GI techniques (from fully baked to fully real-time), and on less powerful hardware than we have today. They don't always look the best, but they are often performant. Don't get me wrong, I've been eagerly awaiting real-time RT for over 15 yrs, but most hardware isn't ready for everything RT can bring yet, even if we can get a taste of that future with high-end NV hardware today. People are prioritizing performance more and more (see the perf modes on console games), and sometimes those older techniques are still the right answer. Other times, some form of RT fits well and is still performant (see The Finals, Indiana Jones, & Spiderman 2). KCD2 used SVOGI instead of RT GI (likely mostly due to the limitations of CryEngine), but it still looked good and ran well on a range of hardware, and it was praised for its performance. Smart trade-offs like that are what I'm asking for.
 

From your post I think what you want is simply for games to be more optimized. It's really not a problem of RT (or any specific technology), and it does not really contradict what was said in the DF video.

The point made in the video is that rendering technology will keep moving forward, and to go forward we sometimes need to go backward a bit. This has always been the case (the 2D-to-3D migration, for example), and it is especially apparent when the new technology is not just an evolutionary one. For example, to use RT GI one has to give up baked lightmaps. That means you can have dynamic global lighting (such as a moving sun), but on the other hand you can't really have a baked-lighting fallback, because now the sun is moving. So you either have to use other expensive methods or you have no GI at all. This is what was meant by "going backward."
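A toy illustration of that fallback problem (made-up values and naming, not any engine's code): the lightmap stores an answer for one sun position, so the moment a time-of-day system moves the sun, the "fallback" is answering the wrong question.

```cpp
// A bake captures indirect light for ONE sun position. Once the sun moves,
// real-time GI tracks it, but the baked result is frozen at bake time.
#include <cstdio>

// Pretend GI solver: indirect light depends on where the sun currently is.
float solveIndirect(float sunAzimuthDeg) {
    return sunAzimuthDeg < 180.0f ? 0.7f : 0.2f;   // bright by day, dim at sunset
}

int main() {
    const float bakedSunDeg   = 90.0f;               // sun position used offline
    const float bakedIndirect = solveIndirect(bakedSunDeg);

    float sunNowDeg  = 260.0f;                       // time-of-day has moved the sun
    float realtimeGI = solveIndirect(sunNowDeg);     // what RT GI would compute now
    float fallbackGI = bakedIndirect;                // what the baked "fallback" returns

    std::printf("sun at %.0f deg: realtime GI %.2f vs baked fallback %.2f (stale, answers for %.0f deg)\n",
                sunNowDeg, realtimeGI, fallbackGI, bakedSunDeg);
}
```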

Some RT effects are more evolutionary in nature, such as reflections: you can fairly easily replace an RT reflection with a faked one without too much trouble (because, without RT, it's faked anyway). Shadows are a bit harder, but you can just use shadow maps for the most prominent light sources, which is of course worse but not catastrophic. RT lighting, on the other hand, is quite different: with traditional lighting, artists generally have to place fake lights to make a scene look natural, but that becomes not just unnecessary but even undesirable when real RT lighting is used. Now if a game uses RT lighting because it looks better and doesn't need the extra fake light sources, it'd be pointless to maintain a fallback that needs the fake light sources all over again.
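Here's a rough sketch of what that reflection fallback chain looks like in practice (plain C++ with made-up function names, not any engine's API). The key point is that every pre-RT reflection technique already approximates the same thing, so there's always something cheaper to fall back to:

```cpp
// Reflection fallback chain: hardware RT if available, screen-space if the
// reflected point is on screen, otherwise a baked cubemap. Quality drops at
// each step, but nothing breaks -- which is why RT reflections are "evolutionary".
#include <cstdio>
#include <optional>
#include <string>

struct ReflectionSample { std::string source; float quality; };

std::optional<ReflectionSample> traceRtReflection(bool rtEnabled) {
    if (!rtEnabled) return std::nullopt;
    return ReflectionSample{"hardware RT", 1.0f};
}

std::optional<ReflectionSample> traceScreenSpace(bool hitVisibleOnScreen) {
    if (!hitVisibleOnScreen) return std::nullopt;    // SSR fails for off-screen hits
    return ReflectionSample{"SSR", 0.6f};
}

ReflectionSample sampleCubemap() {
    return ReflectionSample{"baked cubemap", 0.3f};  // always available, lowest fidelity
}

ReflectionSample shadeReflection(bool rtEnabled, bool hitVisibleOnScreen) {
    if (auto rt  = traceRtReflection(rtEnabled))      return *rt;
    if (auto ssr = traceScreenSpace(hitVisibleOnScreen)) return *ssr;
    return sampleCubemap();
}

int main() {
    const bool rtModes[] = {true, false};
    for (bool rt : rtModes) {
        ReflectionSample s = shadeReflection(rt, /*hitVisibleOnScreen=*/false);
        std::printf("RT %s -> %s (relative quality %.1f)\n",
                    rt ? "on " : "off", s.source.c_str(), s.quality);
    }
}
```

RT lighting has no equivalent chain: the "cheaper thing" would be hand-placed fill lights, and those live in the art, not in the renderer.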

Of course one can argue that the time is still not ripe for such a "revolution." That could be true, but I think we need to start somewhere, and as I said it's been 6 years and it's about time we got it started. I think it's important to remember that if games don't go forward, GPUs won't either, because no one would buy an expensive GPU to play games when a cheap one does the trick. It'd be unfortunate for gamers if all the GPU vendors decided the GPU market is boring and AI is more profitable anyway.
 