There are definitely some unusually heavy rasterized games coming out, like AW2 and MH Wilds. It's a bit of a headscratcher why those games are so GPU heavy. However, I don't think that's the majority. Most of your examples don't even support that. A Plague Tale is 1080p in perf mode on SX/PS5. Gotham Knights uses RT reflections and is close to 2160p (kind of an outlier honestly). Jedi was ~720p in perf mode when RT was on, but since they removed RT for that mode, it's ~1080p most of the time. Forspoken is also ~1080p in perf mode since the latest patch.
UE5 games are another beast. We know Nanite is not the most efficient rasterizer, but it can handle large amounts of small geometry better than a hardware rasterizer can. That doesn't mean rendering all that geo comes cheap though. Lumen is RT, just not hardware RT. These techs (along with VSMs) substantially increase per-pixel processing costs, so I understand why UE5 games are often low res. UE5 in general is an example of what I'm talking about. Games using its headline features often seem to overshoot what the consoles and mainstream PCs can realistically do at acceptable performance and IQ. I'm hopeful that games using newer versions of the engine will be more performant (Avowed isn't perfect, but it's a step in the right direction imo).
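To put the resolution trade-off in rough numbers, here's a quick back-of-the-envelope sketch (plain Python, nothing engine-specific). It only counts pixels and ignores overdraw, upscaling overhead, and everything else, but it shows why dropping the internal res from native 4K to ~1080p roughly quarters the per-pixel cost of things like Lumen and VSMs:

```python
# Back-of-the-envelope: how per-frame pixel work scales with internal resolution.
# Illustrative only; real GPU cost also depends on overdraw, shading complexity,
# upscaling passes, etc.

RESOLUTIONS = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
}

baseline = 3840 * 2160  # native 4K pixel count

for name, (w, h) in RESOLUTIONS.items():
    px = w * h
    print(f"{name}: {px / 1e6:.2f} MPix, ~{px / baseline:.0%} of native 4K per-pixel work")
```

Running it: 720p is ~11% of the 4K pixel load, 1080p ~25%, 1440p ~44%. That's the lever perf modes are pulling when per-pixel features get expensive.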
Plenty of games of all scopes have shipped with non-RT GI techniques (from fully baked to fully real time), and on less powerful hardware than we have today. They don't always look the best, but they are often performant. Don't get me wrong, I've been eagerly awaiting real-time RT for over 15 yrs, but most hardware isn't ready for everything RT can bring yet, even if we can get a taste of that future with high-end NV hardware today. People are prioritizing performance more and more (see perf modes on console games), and sometimes those older techniques are still the right answer. Other times, some form of RT fits well and is still performant (see The Finals, Indiana Jones, & Spiderman 2). KCD2 used SVOGI instead of RT GI (likely mostly due to the limitations of CryEngine), but it still looked good, ran well on a range of hardware, and was praised for its performance. Smart trade-offs like that are what I'm asking for.