Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

I am not comparing to XSX performance.

I just find it ludicrous to claim that if the game performs at 1440p with checkerboarding, it could hit the same resolution natively after removing CB.
XSX uses dynamic resolution from 2160p down to 1440p, so PS5 should use the same with a slightly lower scaling floor (in quality mode XSX runs 2160p-1980p and PS5 2160p-1872p, with the same performance and the same drops in the same testable places).
 
Don't think so. The PS4 Pro version runs at a dynamic 1440p. This runs at 4K60 and 1080p/120. Only native PS5 games run at up to 120fps, so this is a native PS5 game.
THPS only has a PS4 build on PSN and installs to the external SSD, which I believe makes it an enhanced PS4 game?

Am I wrong, or has Sony modified one of the BC modes to let PS4 games access PS5 hardware?
 
NXGamer is saying the CBR is not ideal here, as the game is mostly locked at 60fps at max resolution: 2160p CBR. Across 15 heavy action scenes he found 10 frames at 2160p/60fps and only one at 1576p CBR.

He is saying that with that headroom they should have used native rendering + DRS, particularly given that this CBR often breaks and shows plenty of ugly artifacts. According to DICE, CBR is not free and saves only about 30% of rendering time compared to native resolution.
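
To put the DICE figure in context, here is a minimal sketch (all numbers illustrative, not measured) contrasting the naive 50% saving you'd expect from halving the shaded pixels with the ~30% frame-time saving DICE quotes, the gap being the overhead of reconstruction:

Code:
#include <cstdio>

int main() {
    const double native4K = 3840.0 * 2160.0;  // full-resolution pixel count
    const double cbr4K    = native4K / 2.0;   // CBR shades half the pixels per frame

    printf("Shaded pixels: native %.1fM vs CBR %.1fM (naive saving: 50%%)\n",
           native4K / 1e6, cbr4K / 1e6);

    const double frameNative = 16.6;               // hypothetical ms budget at 60fps
    const double frameCBR    = frameNative * 0.70; // DICE: only ~30% saved in practice
    printf("Frame time: native %.1f ms vs CBR ~%.1f ms\n", frameNative, frameCBR);
    return 0;
}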
 
XSX pulling ahead more and more as the generation continues. I don't think anyone should be surprised: XSX has the more capable GPU (9-10TF vs 12+TF), a faster CPU, more bandwidth, no downclocking shared between CPU and GPU, and more RDNA2 features for efficiency. Also, a wider GPU should fare better in RT or compute-oriented engines like UE5 going forward.
 
I think a lot more is going on CPU-wise in Call of Duty's 120fps mode than in the Tony Hawk remaster, and CoD's 120fps mode runs at 1080-1200p on both XSX and PS5.
 
Recall:
[Slide: effective PS4 GPU bandwidth is ~140 GB/s, not the full 176 GB/s, once CPU memory traffic is accounted for]


Available bandwidth is likely the largest factor here.
On a console the GPU has to compete with the CPU for memory bandwidth, which a PC's discrete GPU does not, so cross-comparisons to PC benchmarks may not always work if you're looking strictly at clock speeds/TF, except in scenarios where bandwidth is not the bottleneck.

If you have a CPU workload that normally takes, say, 10GB/s at 30fps (heavier CPU titles will eat more; more NPCs = more CPU work), it becomes 40GB/s at 120fps. Combine that with the disproportionate way CPU traffic eats into effective bandwidth, as per the older PS4 slide above, and there is even less left for the GPU; the GPU is starved for bandwidth.

This is likely the main culprit for resolution loss before you look at clock speeds, fill rates etc.

So 448GB/s becomes 408GB/s off straight subtraction; then take even more off for that disproportionate contention penalty and you have a GPU that is unlikely to hold higher resolutions at higher frame rates. You're going to be well sub-380GB/s.

Then you couple that with higher CPU usage pulling back any SmartShift power budget that could otherwise go to the GPU, and you're in a scenario where the PS5 cannot produce superior results compared to something like a 5700XT.

5700XT will have all 448GB/s to itself. CPU completely unaffected.
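
To make the arithmetic above concrete, here is a minimal sketch; the 10GB/s CPU figure and the size of the contention penalty are this post's assumptions, not measured numbers:

Code:
#include <cstdio>

int main() {
    const double totalBW     = 448.0;  // GB/s, PS5 / 5700XT class GDDR6 bus
    const double cpuAt30fps  = 10.0;   // GB/s, assumed CPU traffic at 30fps
    const double cpuAt120fps = cpuAt30fps * (120.0 / 30.0);  // scales to 40GB/s

    const double afterSubtraction = totalBW - cpuAt120fps;   // 408GB/s
    const double contentionLoss   = 0.07;  // assumed disproportionate contention hit
    const double gpuEffective     = afterSubtraction * (1.0 - contentionLoss);

    printf("CPU traffic at 120fps: %.0f GB/s\n", cpuAt120fps);
    printf("GPU budget, straight subtraction: %.0f GB/s\n", afterSubtraction);
    printf("GPU budget after contention: ~%.0f GB/s\n", gpuEffective);  // ~379
    return 0;
}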

These are typical challenges for game developers working with consoles, and not every developer can maximize the hardware for every game the same way some of our best AAA teams can.

4K60 and 1440p@120 are hard targets for consoles to hit (due to the sharing-with-the-CPU factor) and will only get harder as the generation goes on. Aside from memory footprint concerns, which SSDs mitigate, not much hardware has been added to mitigate the bandwidth bottleneck. The more computation you are doing, i.e. the more you saturate the CUs, the more bandwidth you pull, as you have to write the results somewhere. Bandwidth will likely be the ceiling here for both. They will need to come up with creative ways to reduce the bandwidth pressure, and that likely means some creative up-sampling techniques.
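
As a sketch of what reducing that pressure can look like on the rendering side, here is a toy dynamic-resolution controller; the frame budget, DRS window, and per-frame GPU times are all hypothetical, and real engines filter over many frames using GPU timestamps:

Code:
#include <algorithm>
#include <cmath>
#include <cstdio>

struct DrsController {
    double scale;     // current fraction of the 2160-line target (minScale..1.0)
    double minScale;  // resolution floor, e.g. 1440/2160
    double budgetMs;  // frame budget, e.g. 8.33 ms for 120fps

    void update(double gpuMs) {
        // Pixel count grows with the square of the axis scale, so adjust by
        // the square root of the budget/actual ratio and clamp to the window.
        scale = std::clamp(scale * std::sqrt(budgetMs / gpuMs), minScale, 1.0);
    }
};

int main() {
    DrsController drs{1.0, 1440.0 / 2160.0, 8.33};
    for (double gpuMs : {7.9, 9.5, 11.0, 8.0, 6.5}) {  // fake per-frame GPU times
        drs.update(gpuMs);
        printf("gpu time %.1f ms -> render at ~%.0fp\n", gpuMs, drs.scale * 2160.0);
    }
    return 0;
}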
 
If I was aware of this, I've forgotten. What's the reason?
1440p isn't part of the HDTV standard. PS5 (and 4) only support the television standards.

PS5 can output at 1440p; that's the resolution of the Demon's Souls remake at 60fps, for example.
While I'm sure the hardware is capable of it, there is no support via the OS or otherwise that lets you output 1440p.

It is a game console; 1440p support is a gimmick.

If someone chooses to only use tech not made for consoles, that is their own fault. Monitors are for computers, simple as that.

If the 0.01% want to game on a monitor, that is their right, but they have no right to complain.

Same as complaining that the PS5 doesn't support SCART.

It's like some users create these issues on purpose and then blame others. Weird.
There is utility in 1440p support. And there are monitors that are 1440p marketed for console use. So it isn't like there is no support from the display industry.
 
There is utility in 1440p support. And there are monitors that are 1440p marketed for console use. So it isn't like there is no support from the display industry.
So many gamers game at a table with a monitor. It's a great experience with high utility (mobile, compact), useful for web browsing, gaming, and media, with very low real-estate costs (a table for a monitor, console, mouse and keyboard, headset, plus a camera and mic if they want a streaming setup).
If all 200M console players had 65" screens and played on a couch, I would be shocked. I'm pretty sure 65" OLED 4K@120fps setups represent less than 5% of the current generation of gamers, and that might be generous. A majority of the 4K gamers this generation are likely still using the display panels they bought when they upgraded to the mid-gen refresh consoles. My 65" LG B7 cost me 3500 CAD, completely dwarfing the cost of any console. It's not reasonable by any measure. Looking back, I could have bought a new gaming PC with top-of-the-line hardware plus both next-gen consoles for the cost of this screen.

1440p has a small but growing market share in a space still dominated by 1080p monitors. The PC gaming space continues to move toward widescreen setups, which I'm insanely jealous of because of their advantage in FPS titles.
 
While the discussion of the PS5's display output resolution support is fascinating, I don't know what that has to do with the internal rendering resolution of a game. Games render at an arbitrary resolution determined by the developers, and this is then scaled to whatever display resolution is supported by the console and the TV.
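
A minimal sketch of that separation, with hypothetical types rather than any real console API:

Code:
#include <cstdio>

struct Resolution { int w, h; };

// Internal render resolution is a per-game, per-mode decision by the developers.
Resolution chooseRenderRes(bool performanceMode) {
    return performanceMode ? Resolution{1920, 1080}   // e.g. THPS on PS5 at 120fps
                           : Resolution{3840, 2160};
}

int main() {
    const Resolution output{3840, 2160};  // output format is fixed by console/TV support
    const Resolution internal = chooseRenderRes(true);
    printf("render at %dx%d, then scale to the %dx%d output\n",
           internal.w, internal.h, output.w, output.h);
    return 0;
}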

I am not even sure why this is a conversation at this moment just because the PS5 version of THPS renders at 1080p/120 while the XSX renders at 1440p. I mean, you can debate, argue, and have a conversation about why the developers made such choices for the two consoles, but to say it is because the PS5 does not support 1440p as an output resolution flies in the face of more or less all the games released since the launch of these consoles.
 
I mean, you can debate, argue, and have a conversation about why the developers made such choices for the two consoles, but to say it is because the PS5 does not support 1440p as an output resolution flies in the face of more or less all the games released since the launch of these consoles.

It's literally always the same two posters doing these gymnastics and pulling out unrelated factoids to try to prove the PS5 has an unfair disadvantage. Same people who saw two launch titles outperform the Xbox and won't stop talking about clock speed being magic, or the console having a secret Infinity Cache, or teraflops plus bandwidth plus memory not meaning anything. Just ignore them.
 