So let me ask you. What happens when a game demands 1440p on XBSX?
We even saw such an example: the UE5 tech demo (let's forget about the SSD tech there for the sake of argument). It ran on a PS5 at 1440p/30 fps, I believe?
So let me ask you. What happens when a game runs at 1440p on an RTX 2080 Ti and you want to run it on a Radeon 5500 XT?
It's simple: you scale some things down until the 5500 XT runs the game fine at the resolution you want it to. It could even be 4K on the 5500 XT if you wanted. Not at the same IQ settings, obviously, but that's the point.
You can scale just about everything graphics-related in order to hit the target performance you want at the resolution you want.
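Just to illustrate the idea (this is a made-up sketch, not any real engine's API): hold the resolution fixed and keep stepping IQ settings down until the frame fits the budget.

```cpp
// Hypothetical sketch of settings scaling at a fixed resolution.
// The settings, their ranges, and the step-down order are made up.
struct Settings {
    int shadows = 3;  // 0 = low .. 3 = ultra
    int foliage = 3;
    int effects = 3;
};

// Lower one setting a notch per call until measured GPU time fits
// the budget (e.g. 16.6 ms for 60 fps, 33.3 ms for 30 fps).
bool StepDown(Settings& s, float gpuMs, float budgetMs) {
    if (gpuMs <= budgetMs) return false;    // already within budget
    int* knobs[] = { &s.shadows, &s.foliage, &s.effects };
    for (int* k : knobs)
        if (*k > 0) { --*k; return true; }  // cut the next knob
    return false;                           // nothing left to lower
}
```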
And scaling from XBSX to XBSS will be far, FAR simpler than going from an RTX 2080 Ti to a Radeon 5500 XT, since in the latter case each GPU has features that don't exist on the other.
The only difference is that on PC, the user does the scaling. On XBSS, the developers will choose what gets scaled down (resolution, IQ levels, etc.). So a 1440p XBSX title could easily run at 1080p on XBSS with the reduced resolution and perhaps reduced shadow quality, less dense foliage, or whatever other graphical tweak the developers feel represents the least noticeable difference.
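Conceptually it's something like this (the struct and the numbers are purely hypothetical, just to show developer-chosen presets):

```cpp
#include <cstdint>

// Hypothetical per-console quality preset; fields and values are
// illustrative only, not from any actual engine or title.
struct QualityPreset {
    uint32_t renderWidth;
    uint32_t renderHeight;
    int      shadowQuality;   // 0 = low .. 3 = ultra
    float    foliageDensity;  // scale factor on instance counts
};

// Same game, two presets: the developers pick what gets dialed down.
constexpr QualityPreset kSeriesX { 2560, 1440, 3, 1.0f };
constexpr QualityPreset kSeriesS { 1920, 1080, 2, 0.6f };
```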
Graphics are the ONLY thing that has to scale, as the CPU is exactly the same. So physics, for example, wouldn't have to be touched. 3D audio wouldn't have to be touched. AI wouldn't have to be touched. Etc.
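You can see that split in a typical frame loop (stub names here, hypothetical, but the shape is standard): the simulation step never reads the graphics preset at all.

```cpp
struct World {};          // placeholder game state
struct QualityPreset {};  // graphics-only knobs, as in the sketch above

// Stub systems; the names are illustrative, not a real engine API.
void UpdatePhysics(World&, float) { /* rigid bodies, collisions */ }
void UpdateAI(World&, float)      { /* pathfinding, behaviors   */ }
void UpdateAudio(World&, float)   { /* 3D audio mixing          */ }
void Render(const World&, const QualityPreset&) { /* draw calls */ }

// Physics, AI, and audio run identically on XBSX and XBSS (same CPU);
// only the Render() call consumes the graphics preset.
void RunFrame(World& world, const QualityPreset& preset, float dt) {
    UpdatePhysics(world, dt);
    UpdateAI(world, dt);
    UpdateAudio(world, dt);
    Render(world, preset);
}
```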
Again, if a developer can't target the XBSX and scale down to XBSS easily, then that is one fail developer.
[edit] Sorry if this should have gone into another thread. I was responding to a post that was a page or two before Brit's post mentioning the scaling thread.
Regards,
SB