Deleted member 86764
Guest
I'm not sure I agree with this. I think the argument was that the console IO systems have brought load times down so much that they're essentially good enough for the long haul (this doesn't apply to streaming, of course). So for load times, even a PC that's 4x faster is arguably not really any better, because the difference between a 2 second load and a 0.5 second load is negligible.
The GPU argument is entirely different though. While some may argue that the difference between 1440p and 4k is meaningless, few would argue the same about 30fps versus 60fps, which is exactly what that 2x performance can give you. Or, if you equalise resolution and framerate, 2x the performance can bring a lot in terms of core graphics, which seems to be the direction many devs are taking on PC right now. That's before we consider VR, where every ounce of additional power counts for a lot. And of course that's only the GPU generation contemporary with these consoles; what about the one after that, or the one after that? Those benefits will keep growing over time, unlike faster IO systems, which even at multiple times the speed would give extremely marginal benefits.
Hey, for those of us that only switch once over the course of a generation, a 2x performance multiplier is small. I went from base PS4 to PS5; that's roughly 4x the resolution times 2x the framerate in some games, an 8x increase. And even then we're only seeing cross-generational games, so the same thing at higher res/Hz.
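To make that multiplier explicit, here's a quick sketch of the pixels-per-second arithmetic, using the common 1080p/30 and 4K/60 figures as illustrative assumptions rather than exact benchmarks:

```python
# Rough pixels-per-second multiplier for a base PS4 (1080p/30) -> PS5 (4K/60) jump.
# Resolutions and framerates are illustrative assumptions, not measured figures.
ps4_pixels = 1920 * 1080
ps5_pixels = 3840 * 2160

resolution_multiplier = ps5_pixels / ps4_pixels  # 4.0x the pixels per frame
framerate_multiplier = 60 / 30                   # 2.0x the frames per second

total = resolution_multiplier * framerate_multiplier
print(total)  # 8.0 -- the multipliers compound, not add
```

The point being that the two factors multiply: 4x the pixels drawn 2x as often is an 8x throughput jump.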
4k is better than 1440p and 60fps is better than 30fps, without a doubt, but they're not huge increases. Base Xbox to Series X is a ~10x multiplier, even before you factor in the RDNA2 improvements that'll come later.
Even if you're a PC guy, you've got to be happy with the new consoles, because they'll push your favoured hardware further thanks to a new baseline (10tf instead of 1.3tf). There's more funding to make games look better than ever.
Edit: proportionally speaking, a 2s load versus a 0.5s load is a much larger difference than 4k versus 1440p.
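For what it's worth, the ratios bear that out; a quick check of the two proportions (standard resolution figures, load times as stated above):

```python
# Proportional comparison from the edit note:
# load-time ratio vs pixel-count ratio.
load_ratio = 2.0 / 0.5                       # 2s vs 0.5s load -> 4x
pixel_ratio = (3840 * 2160) / (2560 * 1440)  # 4K vs 1440p    -> 2.25x

print(load_ratio, pixel_ratio)  # 4.0 2.25
```

So the load-time gap (4x) is proportionally larger than the pixel-count gap (2.25x), which is the edit's point, even if neither difference feels big in practice.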