I wouldn't say redeemed, more like on life support. Sure, it's fixed for now, but how long will the treatment continue, and to what extent?
After all, the 2 GB GTX 770 ran Rise of the Tomb Raider and Arkham Knight with great visuals on its tiny 2 GB buffer. With games after 2018, you have to reduce textures to the point where they look far worse than those two games because of the abhorrent texture quality. (RDR2, for example, looks visually worse than Rise of the Tomb Raider on low textures. Yes, it has better GI, shadows, etc., but clearly a better texturing solution could have been procured. Then again, who cares about 2 GB GPUs at this point?) Sooner or later, once enough people move on from 8 GB cards, devs will stop doing these patches. It just proves the capability is there; it only depends on user outrage. (Though in the case of Forspoken, that ship has sailed. Still, if it really did fix the problem, I could consider getting the game at a 50% sale.)
The VRAM usage in recent games is very mysterious. Individual textures don't look appreciably higher resolution. Is it that there are just many more unique textures nowadays and games rely less on tiling/repetition? I've been struggling with out-of-memory crashes on an 8 GB 3050 when running Injustice 2 at 4K. There's no way on god's green earth a fighting game should be using that much VRAM.
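To put the "more unique textures" theory into rough numbers, here's a quick back-of-envelope sketch. All of the assumptions are mine (BC7-style block compression at 1 byte per texel, a full mip chain adding about a third, three textures per material, and an arbitrary count of 500 unique materials resident at once); none of it is measured from Injustice 2 or any other specific game, it just shows how fast unique texturing adds up compared to a handful of tiled textures.

```python
# Rough, hypothetical VRAM estimate for unique (non-tiled) textures.
# Assumptions: BC7-style compression (~1 byte/texel), full mip chain (~1.33x),
# and a "material" made of 3 textures (albedo, normal, roughness/metallic).

def texture_mib(width, height, bytes_per_texel=1.0, mips=True):
    """Approximate size of one texture in MiB."""
    size = width * height * bytes_per_texel
    if mips:
        size *= 4 / 3  # a full mip chain adds roughly one third on top
    return size / (1024 * 1024)

def material_mib(resolution, textures_per_material=3):
    """Approximate size of one material (set of textures) in MiB."""
    return texture_mib(resolution, resolution) * textures_per_material

for res in (1024, 2048, 4096):
    per_material = material_mib(res)
    total_gib = per_material * 500 / 1024  # 500 unique materials resident
    print(f"{res}x{res} material: ~{per_material:.1f} MiB, "
          f"500 unique materials: ~{total_gib:.1f} GiB")
```

With heavy tiling you might only need a few dozen materials resident, which fits comfortably in a couple of GB; with mostly unique texturing the same math blows past an 8 GB budget at 2K-4K resolutions, which would at least be consistent with the crashes at 4K.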