Current Generation Games Analysis Technical Discussion [2024] [XBSX|S, PS5, PC]

Interesting spec chart. "Higher frame rates can be achieved with DLSS"

Err higher than what chief? There are no frame rates in the chart!

I assume it's all 30fps though.
It's 30fps only on Series S and X with dynamic resolution. If the "Low" or "Medium" presets are roughly equivalent to console quality, then the chart is probably giving the requirements for 30 fps without upscaling at the specified resolutions and quality presets.
 
Should come with XeSS on Series X.
Lazy ATG
SMH
Prior to the most recent update that adjusted the scaling factors, XeSS on a 6700XT at the Ultra Quality or Quality presets performed worse than native resolution. Getting equivalent performance to FSR or UE5 TSR would require a lower rendering resolution, and I'm not sure the higher quality of XeSS's output can compensate for a lower-quality input.
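To make the scaling-factor point above concrete, here's a small sketch computing the internal render resolution implied by each preset. The factor values are the commonly published per-axis ratios for pre-1.3 XeSS and FSR 2; treat them as assumptions, not spec:

```python
# Render resolution implied by an upscaler's per-axis scale factor.
# Preset ratios below are the commonly published values (pre-1.3 XeSS
# and FSR 2) and are assumptions here, not authoritative figures.

def render_resolution(output_w, output_h, scale):
    """Per-axis scale factor -> internal render resolution."""
    return round(output_w / scale), round(output_h / scale)

presets = {
    "XeSS Ultra Quality (old)": 1.3,
    "XeSS Quality (old)":       1.5,
    "FSR 2 Quality":            1.5,
    "FSR 2 Balanced":           1.7,
}

for name, scale in presets.items():
    w, h = render_resolution(3840, 2160, scale)
    print(f"{name:26s} -> {w}x{h}")
```

At a matched 1.5x factor, XeSS Quality and FSR 2 Quality render the same pixel count, so if XeSS is slower at the same input resolution, matching FSR's frame rate means dropping to a lower-resolution input, which is the trade-off described above.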
 
Yeah, it can go into the 70s here: 4090, 4K DLAA, max settings. The PS5 is checkerboarded 4K; perhaps there's some egregiously demanding Ultra setting that massively impacts performance without an equivalent visual boost, but discounting that, it would mean a 4090 is perhaps around ~2x PS5 performance here at most (the PS5 version doesn't have an unlocked mode, so it could be well over 60 in the same spots too). That's... not great?
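As a rough sanity check on that ~2x figure: checkerboard rendering shades about half the samples of native each frame and reconstructs the rest, so first-order shading cost is roughly halved. A back-of-envelope sketch (this ignores reconstruction overhead and everything that doesn't scale with resolution):

```python
# Rough shading-work comparison: native 4K vs. checkerboard 4K.
# Checkerboarding shades ~half the samples per frame, so as a crude
# first-order estimate the pixel-shading cost is ~2x lower than native.

native_4k = 3840 * 2160          # 8,294,400 pixels shaded per frame
checkerboard_4k = native_4k // 2 # ~4,147,200 samples shaded per frame

print(native_4k, checkerboard_4k, native_4k / checkerboard_4k)
```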


On the plus side, very good frametime consistency it seems, which is the most important thing. Seems well multithreaded as well. I'll want to see scores from lower-end CPUs and how it fares with 8GB cards, but I'd still consider a PC port that had excessive GPU demand relative to other titles as 'good' if it scales well. The truly awful ports are the ones where you can't adjust any setting to get smooth performance.
 

Oof. That is... really bad performance for what is only a marginally improved PS4 game. You would expect a 4060 to be comfortably over 60 FPS at native resolution. The visuals don't seem to have changed much.


Even worse for 4K. DLSS appears to be used as a crutch again...
Man, even Nixxes are incompetent now? Who's left that isn't a lazy, incompetent developer, implementing DLSS as a crutch rather than because it's just commonplace, easy, expected and useful?

smh

These are Max settings, man. We shouldn't need to be hardcore PC gaming experts to know that Max settings can carry a very poor performance-to-visual-gain ratio and shouldn't be used to judge a game's level of performance/optimization.

Also, a 4060 is a low-end part, barely better than a 3060, which itself was only about as fast as a 2070. So it's not really the powerhouse of a card you're making it out to be.
 

Oof. That is... really bad performance for what is only a marginally improved PS4 game. You would expect a 4060 to be comfortably over 60 FPS at native resolution. The visuals don't seem to have changed much.


Even worse for 4K. DLSS appears to be used as a crutch again...
Not surprising. Nixxes has been getting some undeserved praise for a while. Their games are generally stable but the performance for the hardware is awful. There’s also the memory management problem.

I think this game struggles to hit 1080p/30/High settings on a 1060, a GPU that’s over twice the performance of a PS4 that runs it at 1080p/30.
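The "over twice the performance of a PS4" claim can be ballparked from peak FP32 throughput. TFLOPS is a crude proxy at best (architecture, bandwidth and API overhead all matter), and the figures below are the commonly cited peak numbers, so treat this as an order-of-magnitude check only:

```python
# Crude sanity check of the "over twice a PS4" claim via peak FP32
# throughput. TFLOPS is a rough proxy only; architecture, memory
# bandwidth and API overhead all matter in practice.

ps4_tflops = 1.84        # PS4 GPU, commonly cited peak FP32
gtx1060_tflops = 4.4     # GTX 1060 6GB at typical boost clocks

print(f"GTX 1060 / PS4 = {gtx1060_tflops / ps4_tflops:.2f}x")
```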
 
These are Max settings, man. We shouldn't need to be hardcore PC gaming experts to know that Max settings can carry a very poor performance-to-visual-gain ratio and shouldn't be used to judge a game's level of performance/optimization.

Yeah maybe Alex will do optimized settings for this game.
 
Ghost of Tsushima has this insane indirect lighting system where they use a real-time compute shader texture compression library for their relit reflection probes once per frame ...

In their procedural grass system (page 22), the determined number of grass blades to render is stored in GDS memory and then there's another compute dispatch that writes out the indirect draw arguments ... (functionality equivalent to the Xbox version of ExecuteIndirect)

They also have a highly complex forward lighting pipeline that also supports multiple BRDFs like anisotropic GGX specular, anisotropic asperity scattering (anisotropic 'fuzziness' feature), and even subsurface scattering all of which may contribute to high register pressure/low occupancy ...
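The cull-then-write-indirect-args pattern described above, where one dispatch counts surviving grass blades (into GDS on PS4) and a second writes out the indirect draw arguments, can be sketched on the CPU. All names and the per-blade vertex count here are illustrative assumptions, not Sucker Punch's actual implementation, which runs as GPU compute:

```python
import struct

# CPU-side sketch of the GPU-driven pattern described above:
# pass 1 counts the grass blades that survive culling (the role GDS
# plays on PS4); pass 2 packs the four 32-bit indirect draw arguments
# that ExecuteIndirect-style APIs consume. Names are illustrative.

VERTS_PER_BLADE = 15  # assumed vertex count per grass blade

def cull_pass(blade_distances, max_draw_dist):
    """Pass 1: count blades within draw distance (the 'GDS counter')."""
    return sum(1 for d in blade_distances if d <= max_draw_dist)

def write_indirect_args(visible_count):
    """Pass 2: pack DrawInstancedIndirect-style arguments:
    (VertexCountPerInstance, InstanceCount, StartVertex, StartInstance)."""
    return struct.pack("<4I", VERTS_PER_BLADE, visible_count, 0, 0)

blade_distances = [5.0, 12.0, 48.0, 120.0, 300.0]
count = cull_pass(blade_distances, max_draw_dist=100.0)
args = write_indirect_args(count)
print(count, struct.unpack("<4I", args))
```

The point of the two-pass structure is that the CPU never needs to read the count back: the GPU consumes the packed argument buffer directly, which is exactly what makes it awkward to map onto older PC API paths.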
 
Ghost of Tsushima has this insane indirect lighting system where they use a real-time compute shader texture compression library for their relit reflection probes once per frame ...

In their procedural grass system (page 22), the determined number of grass blades to render is stored in GDS memory and then there's another compute dispatch that writes out the indirect draw arguments ... (functionality equivalent to the Xbox version of ExecuteIndirect)

They also have a highly complex forward lighting pipeline that also supports multiple BRDFs like anisotropic GGX specular, anisotropic asperity scattering (anisotropic 'fuzziness' feature), and even subsurface scattering all of which may contribute to high register pressure/low occupancy ...

Yeah, the notion that simply because a game was also available on the PS4 it should therefore be a cakewalk to translate to PC, because we have far more powerful GPUs, seems overly presumptuous. These are highly customized engines that were never meant to be multiplatform; there are likely some considerable hurdles/implementation changes needed to map them to a more hardware-agnostic API.

Not surprising. Nixxes has been getting some undeserved praise for a while. Their games are generally stable but the performance for the hardware is awful. There’s also the memory management problem.

They are also the studio tasked with bringing the more difficult projects to the PC, though. I mean, Days Gone and Sackboy were solid ports too (well, eventually), but those are also UE4 games.

That being said, yeah, I no longer see Nixxes as the porting studio and assume the port will be golden out of the gate; they're certainly not flawless. I would not have expected their ports to have GPU demands in the vein of Uncharted/TLOU either. However, as mentioned, they are pretty diligent at actually addressing the most significant problems in PC ports, meaning shader/traversal stutter; The Last of Us's GPU rendering demands were actually the least of its problems. Maybe the lack of traversal stutter in their ports is down mostly to the engines themselves, sure, but they also put a good amount of work into addressing the shader stutter problem (albeit Sony as a publisher seems to place some value on that as well; outside of the initial release of Sackboy, I don't think any Sony game has noticeable shader stutter, except perhaps Returnal at points).

You'll want to wait for a few patches for these releases too, but they're at least diligent in that regard and give considerable support to most of their releases long after launch. Hell, Spider-Man was still getting patches to help with RT performance late last year.

Kinda funny that one of the best PC ports of a proprietary-engine Sony game was the God of War port, and it was done by Jetpack, a Canadian porting studio. Albeit I think they also assisted with Ragnarok, so they probably work closely with Sony. Wonder if the PC port of Ragnarok will still be DX11.
 
Not surprising. Nixxes has been getting some undeserved praise for a while. Their games are generally stable but the performance for the hardware is awful. There’s also the memory management problem.

I think this game struggles to hit 1080p/30/High settings on a 1060, a GPU that’s over twice the performance of a PS4 that runs it at 1080p/30.

I agree that they get a bit too much praise given the generally poor GPU performance in their ports, although to be fair, given they are all PS native games being ported to a totally different architecture (at least in API terms) perhaps that's to be expected.

I don't think we should be comparing to the PS4's performance, though, when we have a native PS5 version to compare against.

Also, we don't know for certain that it doesn't use DRS on console, as far as I'm aware? The DF review from John said he didn't notice any but couldn't rule it out.
 
I agree that they get a bit too much praise given the generally poor GPU performance in their ports, although to be fair, given they are all PS native games being ported to a totally different architecture (at least in API terms) perhaps that's to be expected.

I don't think we should be comparing to the PS4's performance, though, when we have a native PS5 version to compare against.

Also, we don't know for certain that it doesn't use DRS on console, as far as I'm aware? The DF review from John said he didn't notice any but couldn't rule it out.
The PS5 version has little to no visual improvements. It’s perfectly fair to compare to PS4 performance.
 
Yeah, the notion that simply because a game was also available on the PS4 it should therefore be a cakewalk to translate to PC, because we have far more powerful GPUs, seems overly presumptuous. These are highly customized engines that were never meant to be multiplatform; there are likely some considerable hurdles/implementation changes needed to map them to a more hardware-agnostic API.
Despite these being effectively last-generation console games, PC gfx APIs simply lag YEARS behind the freedom and functionality that console gfx APIs give ...

The depressing part is that the recently proposed GPU Work Graphs "mesh nodes" extension in development would potentially have been really helpful to the latest Sony PC ports like Horizon Forbidden West or Ghost of Tsushima, but that boat has clearly sailed. We need MAJOR functionality improvements to bindless, as shown by the Marvel's Spider-Man games, since they can do up to 1 million descriptor updates per FRAME with RT enabled, causing high CPU overhead, and there's no solid timeline for when Microsoft intends to address this in D3D12. UE5 is starting to use global ordered append in its compute-driven materials pipeline to optimize empty-draw compaction, and Media Molecule's Dreams renderer would've been another hard PC port since it made use of that feature too, but there's no clear, straightforward alternative on PC anymore since development on this functionality was shelved long ago ...
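To see why a million descriptor updates per frame is a CPU problem, a back-of-envelope budget helps. The per-update cost below is purely an assumed order of magnitude (tens of nanoseconds for a descriptor copy); the point is that at this volume even a tiny per-update cost eats the whole frame budget:

```python
# Back-of-envelope cost of ~1,000,000 descriptor updates per frame.
# The per-update cost is an assumption, not a measured figure.

updates_per_frame = 1_000_000
cost_per_update_ns = 20            # assumed per-update CPU cost
frame_budget_ms = 1000 / 60        # ~16.7 ms per frame at 60 fps

total_ms = updates_per_frame * cost_per_update_ns / 1e6
print(f"{total_ms:.1f} ms of a {frame_budget_ms:.1f} ms budget")
```

Even at this optimistic per-update cost, the updates alone exceed a 60 fps frame budget, which is why the post above calls for major bindless improvements rather than faster descriptor copies.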

It's not fun when developers have to resort to performance-reducing hacks/workarounds when it's shown right in front of them that they CAN do so much better on consoles ...
 
Nixxes has their hands full right now porting most Sony exclusives: they did Spider-Man Remastered, went straight into Miles Morales, then Ratchet and Clank, then Horizon Forbidden West, then Ghost of Tsushima, and now they have Spider-Man 2 on their hands. And who knows if Last of Us 2 or God of War Ragnarok are theirs or not?

They are most definitely time constrained.
 
Ghost of Tsushima has this insane indirect lighting system where they use a real-time compute shader texture compression library for their relit reflection probes once per frame ...

In their procedural grass system (page 22), the determined number of grass blades to render is stored in GDS memory and then there's another compute dispatch that writes out the indirect draw arguments ... (functionality equivalent to the Xbox version of ExecuteIndirect)

They also have a highly complex forward lighting pipeline that also supports multiple BRDFs like anisotropic GGX specular, anisotropic asperity scattering (anisotropic 'fuzziness' feature), and even subsurface scattering all of which may contribute to high register pressure/low occupancy ...
Excellent point. The methods used here were 'workarounds' on PS4 to get lighting not achievable by other means like RTRT. Those techniques don't necessarily map well to the hardware (and middleware) being ported to. And here we get an important reminder that what's on screen does not tell you what's happening to get it there. A game may look worse and run worse than an equivalent on another platform without the developer being incompetent. Indeed, it might be an astounding piece of code for its original purpose.

I guess as an extreme example of this, we have emulation. A 4080 can run games far better looking than RFOM, and yet it has to work hard to get that PS3 game running well because the code is not a nice fit for the hardware. Written from the ground up for PC, far better visual results could be attained.

Just because a console was an AMD x64 SoC doesn't mean the software developed for it is an ideal design to transition to a Windows PC. As consoles become more like the PC, the delta between software design and PC hardware should diminish, resulting in PC ports performing proportionally better. Native PS5 exclusive games should port better.
 