During the PS2 era, this made a lot more sense because all three consoles had very different architectures. Nonetheless, the Xbox was still far more powerful and more often than not got the better version of multiplats. Hell, it wasn't even uncommon for the GC to have advantages over the PS2 in some games.

It's always been like this, and various developers have confirmed it over the years.
There was a developer in here years ago who said the PS2 got the most time and effort because its install base was huge, while the Xbox and GC time was spent getting those versions to a stage where they were 'good enough' - I'll have to try and dig that post out for you.
The platform with the largest install base, and thus the largest sales potential, is the one that gets the most attention in the development process.
The PS5's install base advantage is touching 2:1 now, so it's in developers' best interests to ensure that version is the best it can be, as it gives them access to the largest pool of potential sales, and thus the most money to be made.
What I do want to know is how much of the current performance disparity we can really attribute to "they optimize more for the PS5". I also don't think this reasoning holds much water, because the differences are generally so minimal that the performance profile of either console isn't enough to sway a buyer one way or the other, let alone hurt a platform's potential sales. The PS5 running 15% worse wouldn't meaningfully impact sales.
It seems like a self-fulfilling prophecy to speculate that the Xbox will run better based on partial specs, and then fall back on "well, the PS5 gets more optimization anyway" when those advantages don't materialize. No one can prove or disprove these claims, so they don't seem particularly useful. It puzzles me further that consoles don't even hold as much of an advantage over similarly specced PC parts as they used to. Neither console seems particularly well optimized the majority of the time.