Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Status
Not open for further replies.
NVIDIA GPUs are severely underperforming in this title as well (just like Intel GPUs); their power consumption is significantly lower than normal. If Intel can do it, NVIDIA could do it as well.

Considering how quick NV are to provide performance drivers for their latest GPUs in the latest games, if they haven't done so by now, I'm not sure I'd hold my breath. It's not like Starfield is a small unknown indie title which may get a performance uplift some months down the road.

Regards,
SB
 
Were UE engine stutter compilation issues so prevalent in the past? It seems to have really blown up in the past 3-4 years. I recall seeing it here and there (such as in Arkham City which is UE3) but now it seems every damn game using UE is having that problem.
 
It's a multifaceted phenomenon. It's always existed as long as we've had shaders. However:
  • In the past, PC developers would spend more time explicitly dealing with the issue to mitigate its effects.
    • For example, pre-compiling shaders on the first run of a game was relatively common 10 years ago; it's pretty uncommon now.
  • Much more attention is now paid to it when it does end up in a game.
    • For example, instead of just complaining about a poorly performing game, people now know what shader stutter looks like due to media attention (especially DF).
    • Therefore, instead of misattributing shader stutter to something like bad frame pacing, random stuttering, etc., they know how to check whether it might be shader stutter.
  • Lastly, shaders are much more complex now. RT increases that shader complexity massively as well.
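A toy sketch of the trade-off described above, with entirely made-up numbers (no real renderer or API involved): compiling a shader on the first frame its material appears produces a visible frame-time spike, while precompiling during a load screen pays the same total cost up front.

```python
# Toy model of shader-compilation stutter. BASE_FRAME_MS and COMPILE_MS
# are invented numbers purely for illustration.

BASE_FRAME_MS = 8.0   # hypothetical cost of rendering one frame
COMPILE_MS = 40.0     # hypothetical cost of compiling one shader

def run_frames(materials_per_frame, precompile=False):
    """Return (load_time, per-frame times); each material compiles once."""
    cache = set()
    load_time = 0.0
    if precompile:
        # Pay every compile up front, as a loading screen would.
        for frame in materials_per_frame:
            for m in frame:
                if m not in cache:
                    cache.add(m)
                    load_time += COMPILE_MS
    frame_times = []
    for frame in materials_per_frame:
        t = BASE_FRAME_MS
        for m in frame:
            if m not in cache:   # cache miss -> hitch on this frame
                cache.add(m)
                t += COMPILE_MS
        frame_times.append(t)
    return load_time, frame_times

frames = [["rock"], ["rock", "water"], ["water", "fire"], ["fire"]]
_, stuttery = run_frames(frames)            # compile on first use
load, smooth = run_frames(frames, precompile=True)
```

Same total compile cost either way; the only question is whether it lands in a loading screen or in the middle of gameplay, which is why the spikes read as "random" stutter when you don't know what to look for.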

Regards,
SB
 
Were UE engine stutter compilation issues so prevalent in the past? It seems to have really blown up in the past 3-4 years. I recall seeing it here and there (such as in Arkham City which is UE3) but now it seems every damn game using UE is having that problem.
No, because everyone would just use DX11.
 
Lords of the Fallen reviews are in, and it looks like another "auteur" dev thinking Unreal Engine will make everything under the hood sing for them. If a shitstorm commences, I bet they will have convenient, hush-hush, passive-aggressive excuses about some hardware incapability. Meanwhile, on screen we see hitching on all platforms, like the code is duct-taped together and barely holding. All of said gameplay and scope was perfectly possible even two gens ago, on hardware with orders of magnitude lesser CPU and IO. They even have a greeting card on Twitter with thanks to influencers for helping patch the game for PS5 and PC (Xbox will be patched in the future...), and influencers are saying "ehh, that's not patched at all."

Have most performance-oriented programmers left the industry and we have shameless PRogrammers, or what?
 
You think this game as it exists right now was possible on a PS3/X360? I think you should head back to r/pcgaming.
 

You have no idea how complex modern game development is.

This isn't about programmers.
 
So I'm playing Rain Code right now and I'm super impressed. Not only is it a fantastic game, but IMO it's a graphical showcase on Switch.
[screenshot]


This is one of the few games on Switch that uses screen space reflections, even in undocked mode. It looks even better on screen (the capture quality of Switch is quite low sadly)

[screenshot]


Texture quality, the baked lighting and AO and especially materials would not feel out of place on the PS4/Xbox One at all.

[screenshot]
 
You think this game as it exists right now was possible on a PS3/X360? I think you should head back to r/pcgaming.

You have no idea how complex modern game development is.

This isn't about programmers.

Regarding scope and gameplay? For sure; it is a Dark Souls clone after all, and not that much has changed there, but CPU/RAM/IO capabilities have risen significantly. The graphics assets/sliders "as is" obviously weren't possible, but I was told, and believe, that Nanite would solve LOD headaches for devs. Devs have wonderful APIs, profilers and middleware at their disposal, and RAM/CPU/storage is pretty comfortable compared to the old times, yet even on console (i.e. not related to the PC shader compilation situation) this is hitching ridiculously, like the code is barely holding together; "buckling" would be the more correct word. I don't see anything on screen that would make it hitch like this.

I know development is complex. Pardon the simplistic narrative, but it really feels like attention to technical craft is more and more of an afterthought these days.
 
Technical craft isn't so much an afterthought as one of many things that impact how a developer's work turns out. And even if they are good at everything, a decision made miles away, or months earlier, or even later, can make their best efforts appear terrible.

Abstraction is very powerful, as are layers of management to handle huge development teams, but they come with costs and risks. In the old days a programmer might know everything going on in a game, and if they didn't they could probably directly ask the person responsible and be at most one person away from the code or the issue they need to understand or troubleshoot.

Now, they could be in another team, in another timezone, on another continent away from their equivalent. You might never get to speak to them. And even if you did (and they spoke the same language), that person might be several layers back from the hardware in an engine made by another company where they can never speak directly to the team responsible. The layers of people and communications and divisions and companies involved can be very deep.

Management at all levels is probably as important these days as technical ability. The best technical minds in the world can't program around a dysfunctional organisation - the best engine will struggle if asset budgets are blown, the best programmers can only do so much if the goalposts keep moving, and the best QC team will fail if insufficient time and resources are put into making sure the quality is there. And messy, hacky, poorly documented code, written to get a troubled game out the door, will become a liability if management don't ensure time to go back and rewrite and re-document after launch (it seems like the Halo engine suffered from years of this, or something like it).

A bit of an aside, but one of the benefits of being a fairly lean and well funded developer, and having your own engine, seems to be that you have a very strong connection between the core technology and a particular game. I think this helps the likes of Remedy, Insomniac and iD to deliver technically strong titles that are pretty solid at launch (or at least pretty shortly after).
 
Had completely slipped my mind this was mainly a DX12 problem.

Eh, sort of - it's more prevalent on DX12 (partly simply because modern games that use DX12 also use far more material shaders than older games), but it certainly happens in DX11 as well. DX11 can help in some games over DX12, but in others (such as Borderlands 3), if they're not doing any shader precompiling at all, they'll still have prominent stutter that's basically indistinguishable from the kind of stutter you associate with DX12.

As @Silent_Buddha mentioned, old games had shader stutter too - but they just 'got away' with it because we didn't know what it was, or the shaders were simple enough and far less numerous that the driver could often compile them with minimal frametime impact (and we weren't playing at 120fps as commonly back then, either).

The aforementioned Arkham City is a perfect example. If you look at early tweak guides/reviews for the PC version, it was 'install on the fastest SSD you can to minimize stuttering!', or more commonly - just avoid the DX11 features altogether, which was still the default recommendation even from a few years ago. While UE3 also had traversal stutter which was an issue in this game as well, the main problem - at least when you first install it - was a ton of shader stutter in DX11.

You can completely fix it though, using DXVK-Async. DXVK also helps with traversal stutter in some titles so it's not just shaders, but the async patch basically negates 99%+ of the stutter, so now it's pretty much a locked 60fps on a HDD - SSD is irrelevant, as IO throughput was never really the issue.
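For anyone wanting to try this themselves: in the community dxvk-async fork (not upstream DXVK), async pipeline compilation is typically switched on via a dxvk.conf next to the game executable. The option and variable names below are from that fork's documentation and may differ between builds, so treat this as a sketch:

```ini
# dxvk.conf, placed next to the game executable (dxvk-async fork only)
dxvk.enableAsync = True
```

The fork also reads a `DXVK_ASYNC=1` environment variable. Note that upstream DXVK has since added a state cache and, in newer versions, graphics pipeline libraries, so on current builds the async patch may no longer be necessary; check the README of whichever build you're using.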
 
Regarding Desordre, software Lumen vs Path Tracing.


So very expensive, but very nice. On Nvidia, anyway. Now imagine the frame rate with RDNA2/3 HW Lumen, and image quality with FSR 2.1.

For about the last 20 years the gap between consoles (and PC GPUs) has mostly been fairly one-dimensional, where you're doing mostly the same stuff but at different levels of performance. But.....

...now there's a gulf opened up between [consoles + AMD PC] and Nvidia* where there are entire dimensions of rendering that are at best pale comparisons on AMD hardware and at worst practically impossible. Heavy use of RT is a no-go for AMD, and image reconstruction is a league behind DLSS any time you lean on it heavily. So I guess this leaves the question for developers of which range of solutions do they simultaneously support to solve the same basic issues (e.g. lighting, reflections), and how far do they try and push consoles. Or maybe they just say fuck it and gimp the RT features on PC. *shrug*

I think the rest of this console gen (and its cross-gen period) could be a bit of a bumpy time as consoles remain unable to deliver features that PC gamers increasingly demand. Hopefully the next cross-gen period will be mercifully short so we can get onto strong RT hardware and better reconstruction tech.

*also Intel but we don't talk about 'em round 'ere. 👻
 
Another hot performer:


I knew there were concerns about performance, but I didn't expect this. I thought it would be some very heavy CPU limits, and while CPU limits do seem to happen around 78fps on a 7800X3D with what they tested here, shockingly the game is massively GPU limited in most situations unless you turn things down to lowest settings. At High settings, and just 1080p, a 4090 is only getting.....get this.....38fps. lol A 3080 is only at 24fps.

This makes Jedi Survivor look like an amazingly optimized game by comparison. This is one of the worst performing major releases we've had in a decade or more. What a shame cuz the game genuinely seems pretty great otherwise. Devs have promised performance improvements as quickly as they can get them out, but lordy. lol
 
This is about Cities: Skylines 2 specifically, but I think when benchmarking sites run into very obvious CPU-limited scenarios, where you have a clump of GPUs performing the same, they should also put the GPU usage % stat alongside them. It gives you an idea of just how limited a given GPU is by the CPU in that scenario.

Of course that doesn't tell the whole story, as a game like Cities Skylines 2 very clearly and admittedly has optimization issues which may not be reflected in the usage stat specifically... but I still think it's useful to know just how much the GPU is being utilized.
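A toy illustration of the point above: two GPUs delivering identical fps look the same on a bar chart, but a busy-percentage column immediately shows which one is being held back by the CPU. The threshold and all the numbers here are invented for the example, not from any real benchmark.

```python
# Toy example: classify a benchmark result as GPU- or CPU-limited from
# the GPU busy percentage. The 95% threshold is an arbitrary choice.

def classify(gpu_busy_pct, busy_threshold=95.0):
    """Label a result as GPU-limited when the GPU is near-fully busy."""
    if gpu_busy_pct >= busy_threshold:
        return "GPU-limited"
    return "CPU-limited"

# Hypothetical results: both cards report the same fps, but only the
# busy % reveals that FastCard is sitting half-idle waiting on the CPU.
results = [
    {"gpu": "FastCard", "fps": 78, "gpu_busy_pct": 55.0},
    {"gpu": "MidCard",  "fps": 78, "gpu_busy_pct": 97.0},
]
labels = {r["gpu"]: classify(r["gpu_busy_pct"]) for r in results}
```

As the post notes, this doesn't capture everything (a badly optimized game can keep the GPU busy doing wasteful work), but it does separate "the GPU is out of headroom" from "the GPU is waiting".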
 
Another hot performer:


I knew there were concerns about performance, but I didn't expect this. I thought it would be some very heavy CPU limits, and while CPU limits do seem to happen around 78fps on a 7800X3D with what they tested here, shockingly the game is massively GPU limited in most situations unless you turn things down to lowest settings. At High settings, and just 1080p, a 4090 is only getting.....get this.....38fps. lol A 3080 is only at 24fps.

This makes Jedi Survivor look like an amazingly optimized game by comparison. This is one of the worst performing major releases we've had in a decade or more. What a shame cuz the game genuinely seems pretty great otherwise. Devs have promised performance improvements as quickly as they can get them out, but lordy. lol

According to reviews, the game is more of a large patch for the first game, with any DLC you had removed so you have to rebuy it over the coming years. Many would probably be better off forgetting this game exists.

Damn I miss Maxis and Will Wright.
 
High integrity of them to put out a statement saying performance will be bad. Imagine if this level of integrity became widespread throughout the industry. I do wonder what these GPU cycles are being spent on.
 
So very expensive, but very nice. On Nvidia, anyway. Now imagine the frame rate with RDNA2/3 HW Lumen, and image quality with FSR 2.1.

For about the last 20 years the gap between consoles (and PC GPUs) has mostly been fairly one-dimensional, where you're doing mostly the same stuff but at different levels of performance. But.....

...now there's a gulf opened up between [consoles + AMD PC] and Nvidia* where there are entire dimensions of rendering that are at best pale comparisons on AMD hardware and at worst practically impossible. Heavy use of RT is a no-go for AMD, and image reconstruction is a league behind DLSS any time you lean on it heavily. So I guess this leaves the question for developers of which range of solutions do they simultaneously support to solve the same basic issues (e.g. lighting, reflections), and how far do they try and push consoles. Or maybe they just say fuck it and gimp the RT features on PC. *shrug*

I think the rest of this console gen (and its cross-gen period) could be a bit of a bumpy time as consoles remain unable to deliver features that PC gamers increasingly demand. Hopefully the next cross-gen period will be mercifully short so we can get onto strong RT hardware and better reconstruction tech.

*also Intel but we don't talk about 'em round 'ere. 👻
The vast majority of gamers need good-looking games, and this generation of consoles is perfectly suited for that.
RT would not help game development in our time, but rather hold it back. In principle, it is brought into game development to simplify programming through the automation of dynamic lighting, but in reality it requires more optimization than previous techniques, and that is precisely why RT is forcing game development to drag on.

RT is not a solution, because you can create great reflections and shadows with methods that have much lower resource requirements. RT is an indulgence, not a necessity, for the gaming industry. What is necessary is to create development tools that shorten the development times required for games in our time, so that more energy remains for the artistic part of graphics; the quality of assets and textures is much more important than such a resource-intensive feature that holds back game development.

Hopefully this generation of consoles will last a long time with the current hardware, developers will exercise self-control, and they will make as many beautiful games as possible for these consoles. The audience of one hundred million players needs this.
 