Here's another video of the weird ray tracing/texture problem. This time it's much more apparent.
I just wish a "VRAM consumption bar" were standard in these games. How am I supposed to know my limits from the engine's perspective? Yes, turning off ray tracing brings the textures back in this instance. But what guarantee is there that it won't happen in another scene? Even without ray tracing, I still get unloaded textures at native 1080p. It's just weird to me.
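For what it's worth, on Windows the OS already exposes exactly this: DXGI reports both the current VRAM usage and the budget it wants an app to stay under. Here's a minimal sketch of what a "VRAM bar" overlay could poll (to be clear, this is just a generic DXGI query, not anything Forspoken actually does internally):

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    // "Local" segment group = dedicated VRAM on a discrete GPU.
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
        return 1;

    printf("OS budget:     %.2f GB\n", info.Budget / 1073741824.0);
    printf("Current usage: %.2f GB\n", info.CurrentUsage / 1073741824.0);
    return 0;
}
```

If games just surfaced those two numbers in the graphics menu, we wouldn't have to guess at our limits.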
As long as the Series S exists, devs will have to provide scalable texture settings that actually look good throughout the generation.
But as for PS5 exclusive games, you're right that there's a chance developers won't care about users with 8 GB or less. Hopefully Forspoken is the exception and not the rule. We'll see how Returnal runs; I'm quite hopeful about that one.
If the Forspoken approach becomes a trend, however, I sure hope it gets called out. Most people have 8 GB or 6 GB cards. It'd be awful if a game looked much worse on a 3070 than on a PS5.
What if the Forspoken approach ends up being the norm for the Series S too? That would be funny. The Series S userbase seems to adapt to everything; they have no trouble playing games upscaled to 1080p from 700-900p via regular temporal upscaling. Meanwhile on PC, a majority of folks find FSR 2 unusable at 1080p and even DLSS questionable there. Maybe it comes down to standards, and clearly Series S players have lower ones (nothing wrong with that). I'm talking about the usual run-of-the-mill temporal upscaling, mind you.
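To put rough numbers on it (the per-axis scale factors below are the published FSR 2/DLSS presets; the side-by-side comparison with the Series S is just my own back-of-the-envelope):

```cpp
#include <cstdio>

int main() {
    // Published FSR 2 / DLSS preset scale factors (per axis):
    // Quality 1.5x, Balanced 1.7x, Performance 2.0x.
    const struct { const char* name; double factor; } presets[] = {
        {"Quality", 1.5}, {"Balanced", 1.7}, {"Performance", 2.0},
    };
    const int outW = 1920, outH = 1080;
    for (const auto& p : presets)
        printf("%-11s -> %.0fx%.0f internal\n",
               p.name, outW / p.factor, outH / p.factor);
    return 0;
}
```

So Quality mode at 1080p reconstructs from 1280x720, i.e. the same ballpark the Series S crowd happily plays in, yet on PC it gets called unusable.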
I wonder how Forspoken will end up working on the Series S then. The game has a 2-year exclusivity deal, so maybe within that time span they'll add some extra texture settings or something. They should've, really.
I agree with you. In this case the problem extends to the 3080 too, or the 3080 Ti and 4070 Ti at 1440p and above. I'm sure a 4070 Ti at 1440p with ray tracing would run into similar problems. It's simply not a good look for the game, or for the hardware in general.
What actually angers me is that this weird behaviour happens while the game plateaus at an odd 6.4-6.6 GB of VRAM usage. I really feel like 8 GB cards are getting gimped big time here. Not only was 8 GB barely enough to begin with, now they also put artificial caps on it to kneecap it further.
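A flat ceiling around 6.4 GB on an 8 GB card smells like a hard-coded fraction rather than a real measurement of what's free. Pure speculation on my part, but a heuristic as simple as the one below would produce exactly that number (the function name and the 0.80 factor are my inventions, not anything from the actual game):

```cpp
#include <cstdint>
#include <cstdio>

// Hypothetical budget heuristic: reserve a flat ~20% of dedicated VRAM
// for the OS/compositor/other apps and cap the streaming pool at the rest.
// The 0.80 factor is my guess, not Forspoken's actual logic.
uint64_t StreamingPoolBudget(uint64_t dedicatedVramBytes) {
    return static_cast<uint64_t>(dedicatedVramBytes * 0.80);
}

int main() {
    const uint64_t GiB = 1024ull * 1024 * 1024;
    // On an 8 GB card this lands at 6.4 GB -- right where the usage plateaus.
    printf("8 GB card:  %.1f GB budget\n",
           StreamingPoolBudget(8 * GiB) / double(GiB));
    printf("10 GB card: %.1f GB budget\n",
           StreamingPoolBudget(10 * GiB) / double(GiB));
    return 0;
}
```

A percentage cap like that would also explain why even 10-12 GB cards plateau below their full capacity instead of filling up.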
Somehow Cyberpunk managed to use all the available VRAM (conveniently) back when the highest-end NVIDIA cards were mostly in the 8-10 GB range (aside from the 3090). I remember most 2020-2021 ray tracing games using the entire VRAM budget to fit ray tracing and good-quality textures together.
All of a sudden in 2022-2023, we get this behaviour with Spider-Man and Forspoken. It seems a bit fishy too. I don't know what to say; it just feels wrong. The resource is there. Why not use it? Do they expect me to be streaming, with 10 Chrome tabs open and Twitch on a second screen, all the time or something?