But it is. RT is the core of how rendering is designed in the first place. There are so many challenges that rasterization simply can't overcome. It looks ugly to me to see last-gen games using GI light probes, even in cutscenes with no local light source to hide the missing occlusion.
I agree in principle, but I do think a lot of the time people get so tied up in the marketing checkmarks or narratives they want to drive that they aren't even looking at the images they are seeing critically. I'll give a few recent examples.
There are games that use largely diffuse, fairly low-density, probe-based "RTGI" (as if that term means anything anymore). On the plus side, doing dynamic traces against triangle geometry is a massive win from a production perspective vs. baking. On the other hand, the visual result is often only marginally better than what you'd expect from baked probes, with rampant leaking and no fine-grained detail at all (or sometimes only via screen traces). Often the RT scene used in these implementations is so simplified that you're basically just getting very broad, diffuse bounce with no fine-grained contact or specular occlusion (which, IMO, are the main hallmarks of GI).
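To make the "leaking" point concrete, here's a toy 1-D sketch (all names and values are my own illustration, not any shipping implementation): if probes are blended purely by distance, a shading point just inside a dark room still picks up light from a bright probe on the other side of a wall. Adding even a crude visibility weight, a stand-in for DDGI-style occlusion tests, kills the leak.

```python
# Toy 1-D illustration of probe "leaking". Two probes straddle a wall:
# a bright one outdoors and a dark one indoors. All values are made up.

def lerp_irradiance(x, probes):
    """Distance-weighted blend of two probes -- the classic leaking case."""
    (x0, e0), (x1, e1) = probes
    t = (x - x0) / (x1 - x0)
    return (1 - t) * e0 + t * e1

def visibility_weighted(x, probes, wall_x):
    """Same blend, but zero the weight of any probe on the far side of the
    wall (a crude stand-in for visibility/Chebyshev weighting)."""
    result, total = 0.0, 0.0
    for px, e in probes:
        # Probe is occluded if the wall sits between it and the shading point.
        occluded = (px - wall_x) * (x - wall_x) < 0
        w = 0.0 if occluded else 1.0 / (abs(px - x) + 1e-3)
        result += w * e
        total += w
    return result / total if total > 0 else 0.0

# Bright probe outdoors at x=0 (irradiance 10), dark probe indoors at x=2 (0.1).
probes = [(0.0, 10.0), (2.0, 0.1)]
wall = 1.0
inside_point = 1.5  # just inside the dark room

print(lerp_irradiance(inside_point, probes))            # outdoor light leaks in
print(visibility_weighted(inside_point, probes, wall))  # only indoor probe survives
```

Real implementations obviously work in 3-D with trilinear weights and filtered visibility data, but the failure mode is the same: sparse probes plus naive interpolation equals light where it shouldn't be.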
Obviously you have to do something to make games fit into 60fps console budgets or whatever, and I think overall the tradeoffs made in these implementations are reasonable. But these implementations (Indy w/o "full RT", The Finals, Avatar, Metro, etc.) should not really be discussed as if they are competing with implementations that are aiming for much higher-fidelity, fine-grained, specular GI (various "full RT"/"PT" things, Lumen in both modes, etc.) because the results are not really comparable. Even if using the reviled SDF path, I would fully expect Lumen SWRT to look quite a bit better than the stock Indy RTGI, albeit at a higher cost.
But that's the rub! The narrative often strays so far into "RT vs no RT" that people entirely lose the plot that there's a variety of solutions that sit in different places on the performance/quality spectrum. Even critically important questions, like what your tracing scene representation looks like and at what frequency you're querying and updating it, are lost in the marketing and console wars. People start making arguments that are patently silly from a technical perspective because it supports whatever marketing narrative.
On the RT front I don't think the current tech landscape is actually that complicated. We want dynamic GI across the stack because the production burden of baking is increasingly unworkable with modern amounts of content. Therefore stuff like real-time probe diffuse GI makes sense even if it doesn't necessarily look much better than offline baking (which, guess what guys... has always been raytracing...) would have. As we scale up quality and need sharper specular, low-res BVHs or SDFs become insufficient, as do world-space probes. This is honestly where the real cost is paid: jumping to a solution that maintains a reasonably high-quality BVH and higher-frequency raytracing that can capture contact effects in addition to broad-scale occlusion. These solutions are much more expensive than the others and thus basically require temporal and spatial reuse in various domains. But they are the only current way to get these effects at sufficient fidelity, which leads back to the original point.
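For readers less familiar with what "temporal reuse" buys you, here's a minimal sketch of the core idea (my own illustration with made-up numbers): exponentially blending each frame's noisy 1-sample-per-pixel estimate into a running history converges toward the true value even though no single frame is usable on its own. Real denoisers add reprojection, disocclusion checks, and variance clamping on top of this, plus spatial filtering and things like ReSTIR-style sample reuse.

```python
import random

def temporal_accumulate(noisy_samples, alpha=0.1):
    """Exponential moving average over successive noisy frames:
    history = lerp(history, new_sample, alpha). Smaller alpha means more
    temporal reuse (smoother result, but more lag/ghosting under motion)."""
    history = noisy_samples[0]
    for s in noisy_samples[1:]:
        history = (1 - alpha) * history + alpha * s
    return history

random.seed(0)
true_radiance = 0.5
# One sample per pixel per frame: very noisy, unbiased estimates.
frames = [true_radiance + random.uniform(-0.4, 0.4) for _ in range(200)]

print(abs(frames[-1] - true_radiance))                     # single-frame error
print(abs(temporal_accumulate(frames) - true_radiance))    # accumulated error
```

The catch, of course, is that the "true" value moves every frame in a real game, which is exactly why these pipelines trade some ghosting and lag for the ability to afford the rays at all.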
Yes RT is a foundational technology that will be used more as we move up the quality stack. This has been true for decades, with only the timelines being a question mark. I'm glad we're getting to the point where we can start to assume that some hardware support for ray queries is present in games we ship, and thus I'm really happy to see Indy take a dependency on that as it opens the way for more progress once that becomes the norm. But we don't need to pretend that "RT" in the title of a technique automatically makes it look great or better than something without it because in reality, "RT" is as much a marketing term now as a technical one. Let's look at the actual images and compare on the phenomena they model and at what accuracy and detail levels, algorithms aside.
Most games don't exhibit polygonal edges anymore. Yes, the detail in HB2 is insane but it's still not implementing the lighting properly.
...
Indy Jones texture work is very high quality for the most part.
Let's not get silly, there's tons of polygonal edges in games, even Nanite ones. I don't really know what you mean by "lighting properly" because that's an arbitrary definition. What matters is how close the result is to a reference and I think Hellblade does relatively fine in that regard in the scope of its environments.
The texture point is one that I want to briefly mention though, because I think it characterizes a lot of the amateur discussion of these games online. People in general respond to high-resolution textures (and post-processing...) as if they are the primary thing that makes a game have "good graphics". Obviously super smudgy stuff isn't visually appealing, but on the other hand, really high-res textures and normal maps on low-res geometry that doesn't interact with detailed lighting is last-gen technology. See basically every "Skyrim with mods looks better than XYZ modern game!" post, but I'll even pick on a modern example.
Dev hell and all that aside, just judging the outcome at launch, I don't think Stalker 2 is a particularly good-looking game despite fairly modern technology; I would not put it on my best-of-2024 list even as a runner-up. I think people generally just think it looks good because it has pretty high texture resolutions and occasionally good geometric/asset density, but the lighting really does not strike me as anything special in the majority of what I've played and seen. Obviously people are free to like high-resolution textures and such if they want, but from a visual and lighting standpoint it's not really in the same league as other things on the list.
And to get on my personal soapbox again, I don't think stuff like the shadow quality of foliage in Stalker 2 or non-RT Indy, or to a slightly lesser extent Outlaws, is acceptable in a 2024 game that gets put on a "best graphics of" list. (Obviously the ones that also have an RT path on PC get a pass, assuming folks are speaking of those versions.) I don't really care if your textures and geometry are the size of pixels if your shadow map texels cover hundreds of pixels. I don't know why anyone else puts up with it either, other than the fact that they got used to it. Honestly the biggest visual impact of the "full RT" paths often isn't even the GI; it's the fact that all lights get ray-traced shadows. I guess this is at the very least a "light at the end of the tunnel" for shadow quality (pun intended), but it does start to feel like people are using it as an excuse to not even do a decent shadow map implementation in the meantime.
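The "texels covering hundreds of pixels" point is easy to sanity-check with back-of-envelope math. The numbers below are purely illustrative (not taken from any specific game): a single cascade stretched over a large area simply can't keep up with a 4K screen looking at nearby geometry.

```python
def texels_per_meter(shadow_map_res, frustum_width_m):
    """How finely a shadow cascade samples the world, linearly."""
    return shadow_map_res / frustum_width_m

def screen_pixels_per_shadow_texel(shadow_map_res, frustum_width_m,
                                   screen_width_px, view_width_m):
    """Rough linear ratio of on-screen pixels per shadow-map texel,
    assuming the viewed region spans the screen horizontally."""
    texel_size_m = frustum_width_m / shadow_map_res
    pixel_size_m = view_width_m / screen_width_px
    return texel_size_m / pixel_size_m

# Illustrative: a 2048x2048 cascade stretched over 100 m, viewed on a
# 3840-pixel-wide screen that shows about 10 m of world.
print(texels_per_meter(2048, 100.0))
# Linear ratio; square it for area, so ~19 pixels linearly is
# hundreds of screen pixels of area per shadow texel.
print(screen_pixels_per_shadow_texel(2048, 100.0, 3840, 10.0))
```

Cascades, texel snapping, and filtering all help at additional cost and complexity, which is exactly why per-light ray traced shadows end up being such a visible win in the "full RT" modes.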
Anyways, take all this with a grain of salt. I think the DF list is broadly reasonable, and while no two people will ever make exactly the same list, it's always just a question of what one's priorities are. It's great to see a bunch of bespoke renderers and techniques like Tiny Glade (and PoE2 is probably another good candidate) as well. While these rendering techniques don't necessarily have broad application, IMO *they* are the real cases that strongly show places where developing your own rendering tech specific to your game constraints (whether within the umbrella of a game engine or otherwise) still has an important place in 2024 and beyond.