Digital Foundry Article Technical Discussion [2025]

But there are different types of precomputed lighting that give varying degrees of interactivity, from lightmaps to irradiance volumes to PRT. There are also methods of real-time lighting that are not hardware RT, like SVOGI, real-time cubemaps, etc. These all have their own drawbacks, but it's not like updating a BVH with lots of dynamic objects is particularly viable right now either. The point is, there are lighter-weight options that still can look "good" and not require a game to be noisy and run at 720p internal.
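To make one of those terms concrete: an irradiance volume is basically a grid of baked light probes that dynamic objects blend between at runtime. Below is a minimal sketch of that blending (my own toy illustration, not from the article or any particular engine; all names are made up, and a real implementation would store SH or ambient-cube probes per cell rather than a single RGB value):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };

// Toy irradiance volume: a uniform grid of baked probes (assumes at least
// 2 probes per axis). Dynamic objects sample it with trilinear interpolation.
struct IrradianceVolume {
    int nx = 2, ny = 2, nz = 2;   // probe counts along each axis
    Vec3 origin;                  // world-space position of probe (0,0,0)
    float spacing = 1.0f;         // distance between neighbouring probes
    std::vector<Vec3> probes;     // baked irradiance per probe (RGB)

    Vec3 probe(int x, int y, int z) const {
        return probes[(z * ny + y) * nx + x];
    }

    // Trilinearly blend the 8 probes surrounding a world-space point p.
    Vec3 sample(const Vec3& p) const {
        float fx = (p.x - origin.x) / spacing;
        float fy = (p.y - origin.y) / spacing;
        float fz = (p.z - origin.z) / spacing;
        int x0 = std::clamp(int(std::floor(fx)), 0, nx - 2);
        int y0 = std::clamp(int(std::floor(fy)), 0, ny - 2);
        int z0 = std::clamp(int(std::floor(fz)), 0, nz - 2);
        float tx = std::clamp(fx - x0, 0.0f, 1.0f);
        float ty = std::clamp(fy - y0, 0.0f, 1.0f);
        float tz = std::clamp(fz - z0, 0.0f, 1.0f);
        Vec3 acc;
        for (int dz = 0; dz < 2; ++dz)
        for (int dy = 0; dy < 2; ++dy)
        for (int dx = 0; dx < 2; ++dx) {
            float w = (dx ? tx : 1 - tx) * (dy ? ty : 1 - ty) * (dz ? tz : 1 - tz);
            Vec3 q = probe(x0 + dx, y0 + dy, z0 + dz);
            acc.x += w * q.x; acc.y += w * q.y; acc.z += w * q.z;
        }
        return acc;  // indirect light applied to a dynamic object at p
    }
};
```

The per-frame cost is just a handful of lookups and a blend, which is why this class of technique scales down so much more gracefully than tracing against a BVH, at the price of baked (largely static) lighting.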

The reality is those techniques don’t scale and don’t produce convincing results; otherwise they would be more widespread. Why don’t we have tons of games with non-raytraced GI today? Instead we got screen-space AO, screen-space reflections, and terrible aliased shadow maps cast by a handful of lights at best. RT didn’t cause the problem; it’s just the best current solution to problems that have existed for a long time.

As we see in KCD2, it is possible to ship games that run well if you don’t push the envelope on visuals. But it is a trade-off.
 
Ah, the Capcom DRaMa is back....

 
If you think some developers making use of said technology are the problem
No, the problem is the developer not optimizing his game; the textures look like ass even on a PS5.


If you are talking about DirectStorage, several games have already been released with it and run fine. NVIDIA even has RTX IO versions in titles like Portal with RTX, Portal: Prelude RTX and the upcoming Half-Life 2 RTX, and they all run fine. However, already-broken games don't run fine, because they are broken.
 
No, the problem is the developer not optimizing his game; the textures look like ass even on a PS5.


If you are talking about DirectStorage, several games have already been released with it and run fine. NVIDIA even has RTX IO versions in titles like Portal with RTX, Portal: Prelude RTX and the upcoming Half-Life 2 RTX, and they all run fine. However, already-broken games don't run fine, because they are broken.
I wish those YouTube titles said "don't buy this game on PC AND CONSOLES". Some of the textures he is showing in that video can only be explained by the absence of a texture streaming system in this engine.

Knowing what the engine is, how the hell do you even decide to just go ahead and make another mainline Monster Hunter with it?

It's madness. I'll get the game tomorrow, since I got it physical and there was a delay, and I guess I'll see the damage for myself.
 
Some of the textures he is showing in that video can only be explained by the absence of a texture streaming system in this engine.

Actually, this game does have texture streaming, and fairly aggressive streaming at that. The evidence: on my old PC, the NVMe SSD sometimes got disconnected while running this game, and the whole scene would gradually turn black. You can even watch new textures fail to load and become plain black triangles. If there were no texture streaming, that wouldn't happen at all.
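For what it's worth, that failure mode is roughly what you'd expect from an aggressive streamer losing its backing storage: a low-res mip stays resident as a fallback, higher mips are fetched asynchronously, and if the drive drops out the request simply never delivers. A toy sketch of that flow (nothing here is Capcom's actual code; every name, and the always-failing read, are made-up assumptions):

```cpp
#include <chrono>
#include <cstdio>
#include <future>
#include <optional>
#include <string>
#include <vector>

struct TextureData { std::vector<unsigned char> texels; };

// Stand-in for an async read of a high-res mip from the SSD. Here it always
// "fails", which is roughly what a disconnected NVMe drive looks like to the
// streamer: the request never delivers valid data.
std::optional<TextureData> readMipFromDisk(const std::string& path, int mip) {
    (void)path; (void)mip;
    return std::nullopt;  // simulate the drive dropping out mid-read
}

struct StreamedTexture {
    std::string path;
    TextureData resident;  // low-res mip kept in memory as a fallback
    std::future<std::optional<TextureData>> pending;  // high-res mip in flight

    void requestHighRes(int mip) {
        pending = std::async(std::launch::async, readMipFromDisk, path, mip);
    }

    // Called at draw time: use the streamed mip if it arrived, otherwise keep
    // whatever is resident. If even the resident data was evicted to make room,
    // the draw ends up sampling nothing -- i.e. the black triangles described above.
    const TextureData& bindForDraw() {
        using namespace std::chrono_literals;
        if (pending.valid() && pending.wait_for(0s) == std::future_status::ready) {
            if (auto hi = pending.get())
                resident = std::move(*hi);  // streamed mip arrived, use it
        }
        return resident;
    }
};

int main() {
    StreamedTexture tex{"monster_albedo.tex", TextureData{{128, 128, 128}}, {}};
    tex.requestHighRes(0);
    const TextureData& bound = tex.bindForDraw();
    std::printf("bound %zu texels (still the low-res fallback, since the read failed)\n",
                bound.texels.size());
}
```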
 
Actually, this game does have texture streaming, and fairly aggressive streaming at that. The evidence: on my old PC, the NVMe SSD sometimes got disconnected while running this game, and the whole scene would gradually turn black. You can even watch new textures fail to load and become plain black triangles. If there were no texture streaming, that wouldn't happen at all.
If it does have one, then it's a really bad system. Sadly that's probably even worse 😅
 
720p internal is not usually caused by ray tracing; 90% of the time it's caused by raster games, like the recently released Monster Hunter Wilds. There is also A Plague Tale: Requiem, Gotham Knights, Star Wars Jedi: Survivor and Forspoken (its Performance mode doesn't support ray tracing yet runs at 720p internal), or the usual UE5 suspects, like Immortals of Aveum, Avowed, Black Myth: Wukong, etc.
There are definitely some unusually heavy rasterized games coming out, like AW2 and MH Wilds. It's a bit of a head-scratcher as to why these games are so GPU heavy. However, I don't think that's the majority. Most of your examples don't even support that. A Plague Tale is 1080p in perf mode on SX/PS5. Gotham Knights uses RT reflections and is close to 2160p (kind of an outlier honestly). Jedi was ~720p when RT was on for the perf mode, but since they removed RT for that mode, it's ~1080p most of the time. Forspoken is also ~1080p in perf mode since the latest patch.

UE5 games are another beast. We know Nanite is not the most efficient rasterizer, but it can handle large amounts of small geometry better than a hardware rasterizer can. Doesn't mean rendering all that geo comes cheap though. Lumen is RT, just not hardware RT. These techs (along with VSMs) substantially increase per-pixel processing costs, so I understand why UE5 games are often low res. UE5 in general is an example of what I'm talking about. Games using its headline features often seem to overshoot what the consoles and mainstream PCs can realistically do at acceptable performance and IQ. I have hope games using newer versions of the engine will be more performant (although not perfect, Avowed is a step in the right direction imo).
The reality is those techniques don’t scale and don’t produce convincing results; otherwise they would be more widespread. Why don’t we have tons of games with non-raytraced GI today? Instead we got screen-space AO, screen-space reflections, and terrible aliased shadow maps cast by a handful of lights at best. RT didn’t cause the problem; it’s just the best current solution to problems that have existed for a long time.

As we see in KCD2, it is possible to ship games that run well if you don’t push the envelope on visuals. But it is a trade-off.
Plenty of games of all scopes have shipped with non-RT GI techniques (from fully baked to fully real time), and on less powerful hardware than we have today. They don't always look the best, but they are often performant. Don't get me wrong, I've been eagerly awaiting real time RT for over 15 yrs, but most hardware isn't ready for everything RT can bring yet, even if we can get a taste of that future with high-end NV hardware today. People are prioritizing performance more and more (see perf modes on console games), and sometimes those older techniques are still the right answer. Other times, some form of RT fits well and is still performant (see The Finals, Indiana Jones, & Spider-Man 2). KCD2 used SVOGI instead of RT GI (likely mostly due to the limitations of CryEngine), but it still looked good and ran well on a range of hardware. It was praised for its performance. Smart trade-offs like that are what I'm asking for.
 
There are definitely some unusually heavy rasterized games coming out, like AW2 and MH Wilds. It's a bit of a head-scratcher as to why these games are so GPU heavy. However, I don't think that's the majority. Most of your examples don't even support that. A Plague Tale is 1080p in perf mode on SX/PS5. Gotham Knights uses RT reflections and is close to 2160p (kind of an outlier honestly). Jedi was ~720p when RT was on for the perf mode, but since they removed RT for that mode, it's ~1080p most of the time. Forspoken is also ~1080p in perf mode since the latest patch.

UE5 games are another beast. We know Nanite is not the most efficient rasterizer, but it can handle large amounts of small geometry better than a hardware rasterizer can. Doesn't mean rendering all that geo comes cheap though. Lumen is RT, just not hardware RT. These techs (along with VSMs) substantially increase per-pixel processing costs, so I understand why UE5 games are often low res. UE5 in general is an example of what I'm talking about. Games using its headline features often seem to overshoot what the consoles and mainstream PCs can realistically do at acceptable performance and IQ. I have hope games using newer versions of the engine will be more performant (although not perfect, Avowed is a step in the right direction imo).

Plenty of games of all scopes have shipped with non-RT GI techniques (from fully baked to fully real time), and on less powerful hardware than we have today. They don't always look the best, but they are often performant. Don't get me wrong, I've been eagerly awaiting real time RT for over 15 yrs, but most hardware isn't ready for everything RT can bring yet, even if we can get a taste of that future with high-end NV hardware today. People are prioritizing performance more and more (see perf modes on console games), and sometimes those older techniques are still the right answer. Other times, some form of RT fits well and is still performant (see The Finals, Indiana Jones, & Spider-Man 2). KCD2 used SVOGI instead of RT GI (likely mostly due to the limitations of CryEngine), but it still looked good and ran well on a range of hardware. It was praised for its performance. Smart trade-offs like that are what I'm asking for.

From your post I think what you want is simply for games to be more optimized. It's really not a problem of RT (or any specific technology). This doesn't really contradict what was said in the DF video.

The point made in the video is that rendering technologies will move forward, and to move forward sometimes we need to go backward a bit. This has always been the case (see the 2D-to-3D migration). It's especially apparent when the technology is not just an evolutionary one. For example, to use RT GI one has to give up baked lightmaps. This means you can have dynamic global lighting (such as a moving sun); on the other hand, you can't really have a baked lighting fallback, because now the sun is moving. So you either have to use other expensive methods or you have no GI at all. That is what was meant by "going backward."
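A toy way to see the "no fallback" point (my own sketch, not anything from the video or a real engine; names and numbers are made up): the baked texel is frozen at whatever the sun was doing at bake time, so the current sun position isn't even an input at runtime, while the dynamic path has to re-run the solve every frame, which is exactly the cost you can't opt out of once the sun moves.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Sun { float elevation; };  // radians above the horizon

// Stand-in for the expensive GI solve: indirect light grows with sun height.
float solveIndirect(const Sun& sun) {
    return std::max(0.0f, std::sin(sun.elevation));
}

int main() {
    Sun bakeTimeSun{0.9f};
    float bakedTexel = solveIndirect(bakeTimeSun);  // done once, offline

    for (float t = 0.0f; t <= 1.21f; t += 0.4f) {   // the sun moves at runtime
        Sun now{t};
        float dynamicGI = solveIndirect(now);       // what RT GI pays for per frame
        std::printf("sun elevation %.1f  baked %.2f  dynamic %.2f\n",
                    t, bakedTexel, dynamicGI);      // the baked value never reacts
    }
    return 0;
}
```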

Some RT effects are more evolutionary in nature, such as reflections, since one can easily replace RT reflections with faked ones without too much trouble (because, without RT, they're faked anyway). Shadows are a bit harder, but one can just use shadow maps for the most prominent light sources, which is of course worse but not catastrophic. RT lighting, on the other hand, is quite different: with traditional lighting, artists generally have to place fake lights to make a scene look natural, but that is not just unnecessary, it's actively undesirable when real RT lighting is used. So if a game wants to use RT lighting because it looks better and doesn't need the extra fake light sources, it'd be pointless to have a fallback that needs fake light sources again.

Of course, one can argue that the time is still not ripe for such a "revolution." That could be true, but I think we need to start somewhere, and as I said it's been six years and it's about time we got started. It's important to remember that if games don't move forward, GPUs won't either, because no one would buy an expensive GPU to play games when a cheap one does the trick. It'd be unfortunate for gamers if all the GPU vendors decided the GPU market is boring and AI is more profitable anyway.
 
I have tried the disc version of Wilds on PS5, and compared to the beta, it's much better. The framerate in performance mode is pretty stable and the image quality is OK at TV viewing distances.

That is, unless you get into an area with foliage. There it becomes pixel soup. It's so bad. For the image quality that it presents, you would think it was running two ray traced effects on console, but you can't see them.

The game is enjoyable nonetheless, but Capcom is playing with fire releasing such a big game in this state.
 
Of course, one can argue that the time is still not ripe for such a "revolution."

In order to make that argument you would need to offer examples of advanced graphics that don't rely on tracing techniques. All of the impressive-looking non-raytraced games I can think of rely heavily on baking. I suppose you can also argue that we don't need better graphics, just better games, but I don't think that opinion will find much sympathy in the market.

That could be true, but I think we need to start somewhere, and as I said it's been six years and it's about time we got started.

I have seen folks suggest that RT in games should be introduced after hardware is fast at RT. Chicken, meet egg.
 

DF Direct Weekly #203: AMD RDNA 4 Reveal, Fable Delayed To 2026, Forza Horizon PS5, Pokémon Legends


0:00:00 Introduction
0:00:46 News 1: AMD RDNA 4 graphics cards unveiled
0:24:17 News 2: Fable delayed to 2026
0:33:38 News 3: La Quimera impresses in debut trailer
0:41:28 News 4: Sonic Racing CrossWorlds beta tested!
0:49:39 News 5: Terminator 2D: No Fate revealed
0:56:15 Supporter Q1: What do you make of Forza Horizon 5’s modes on PS5 and PS5 Pro?
1:00:51 Supporter Q2: Was Pokémon Legends: Z-A actually exhibited running on Switch 2?
1:07:28 Supporter Q3: Will we see FSR 4 in handhelds soon?
1:11:35 Supporter Q4: Could you do a video comparing top-end PC experiences versus consoles?
1:23:31 Supporter Q5: Why have PS5 Pro sales dropped behind PS4 Pro?
1:34:07 Supporter Q6: Could a Steam Deck 2 release in 2025?
1:42:15 Supporter Q7: Should prospective PC gamers just buy a PS5 Pro, given the high price of new GPUs?
 
Huh? There's more to PCs and more to PC gaming than the latest high fidelity SP games.

The vast majority of PCs sold are sans discrete GPUs.
 
They can have their indies or all of the small-time projects as much as they want, but those systems will clearly NEVER reach a satisfactory point for the traditional big-time project publishers ...
 
Four years ago, Metro Exodus Enhanced Edition proved that a game with ray traced global illumination could run at 60 FPS on console, even Series S. Recently, Indiana Jones demonstrated it again. Doom: TDA will do so yet again, and there's a good chance it will get a Switch 2 port that also runs at 60 FPS. Since ray traced GI can scale down to the weakest non-mobile hardware still relevant today, there's little reason for developers to invest in non-ray traced real-time global illumination. Software Lumen was a good solution when UE5 launched, and we'll continue to see games release with it until the current console generation ends, but it's clear that HW Lumen and MegaLights represent the future of lighting in Unreal. SVOGI was a good solution for hardware that had the compute power to run it but lacked RT support, but now all relevant hardware supports RT.

KCD2 uses SVOGI instead of RTGI because CryEngine supports SVOGI but not RTGI, not because Warhorse Studios believes SVOGI is technically superior to RTGI. There's nothing wrong with using features the engine already supports instead of spending man-hours modifying a licensed engine, especially if there's still a significant number of fans in the Czech Republic running Pascal or older cards. But claiming that KCD2 proves RT shouldn't be used just doesn't make sense. RTGI can scale down to deliver the same or better visuals at the same performance cost as SVOGI, and also scale up to deliver significantly better visuals at a higher performance cost.
 
In order to make that argument you would need to offer examples of advanced graphics that don't rely on tracing techniques. All of the impressive-looking non-raytraced games I can think of rely heavily on baking. I suppose you can also argue that we don't need better graphics, just better games, but I don't think that opinion will find much sympathy in the market.
Is that because of specific genres/styles having expectations or something? I ask because there are plenty of super-popular indies that have a rudimentary presentation but no-one cares. Why can Vampire Survivors get away with looking like arse where a high-tier narrative game has to look gorgeous? Why can a new horror game get away with PS1 graphics depending on who's producing it?
 
Is that because of specific genres/styles having expectations or something? I ask because there are plenty of super-popular indies that have a rudimentary presentation but no-one cares. Why can Vampire Survivors get away with looking like arse where a high-tier narrative game has to look gorgeous? Why can a new horror game get away with PS1 graphics depending on who's producing it?

Part of it is transactional. $5 games like Vampire Survivors obviously have a much lower bar to meet. Then there are expectations based on genre or simply what has gone before. People complain when the latest Assassin's Creed has similar or "downgraded" graphics in their eyes compared to the prior entry, even though the graphics are still objectively very good. If graphics stagnate at the $70 tier, the reception of those blockbuster titles will suffer.

I just finished Life is Strange, a very old UE3 game, and enjoyed it immensely. Partly because the price of entry was very low, and partly because graphics wouldn't have significantly improved my enjoyment of a very narrative-focused experience. Prettier graphics would've been nice for sure, but not necessary.
 
Advancing through Wilds, the PS3-level textures aren't really that rare. They are everywhere. On monsters, on NPCs' clothing, on the environment... This game has the worst texture quality of any game I have seen on PS5. I struggle to remember any game on PS4 that was worse.
 
They can have their indies or all of the small-time projects as much as they want, but those systems will clearly NEVER reach a satisfactory point for the traditional big-time project publishers ...
The APUs in Xbox and PlayStation and Switch and Steam Deck and the Steam Deck's contemporaries would like a word with you. And of course we're hand-waving off the immense pile of x86 PCs which aren't intended or ever used for "gaming", which, by the way, is the overwhelming supermajority of devices ever sold and continuing to be sold to this day. Business laptops and desktops, cash registers, servers that aren't meant for AI (which is the grand majority of all servers sold), the list is at least a kilometer long in 2pt font.

No, integrated graphics are not going away at any point in the hypothetical future.
 