Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

I'm just embarrassed on your behalf for this. Good lord.

The PS5 acts similarly to Proton on the Steam Deck, which translates Direct3D into Vulkan...because its OS is also based on BSD, which is a derivative of Linux?! Are you fucking kidding me.

Look, we all want civility here, and sure some responses to Michael have been over the line. But it's also difficult to take someone seriously who makes posts like this.
The only embarrassing thing here is the style of your posts, which is certainly not helping the discussion.
 
Your reply is empty; you said nothing. Explain your point.

And explain what comprises the functions of Proton; if you think it is all DXVK, you are incorrect. Also, elaborate on why the OS layer does not matter.
You have to connect the dots for us, Michael. In any debate, the burden of proof is on the poster who makes the claim. He's asking you to connect the dots; it's not fair to ask the other posters here to connect them for you, since you've made the claim and we don't know what you're referring to.
 
There have been some extreme theories put forward here just to cast the PS5 in a positive light, despite all the odds. Such care, thought and "analysis" is not applied whenever a PC part performs worse than expected.

I can, then, say all PlayStation exclusive ports so far run on a makeshift GNMX wrapper. I don't even need to prove anything, I guess. Therefore, all bad or unexpected PC performances so far are now justified, due to this theoretical wrapper I just created out of thin air. If God of War, Death Stranding, Spider-Man and UC4 run on this theoretical GNMX wrapper created by their porters, then PC software, DX12 or the architecture of PCs have nothing to do with the outlier performance deficits that can be observed at certain times in these games. I mean, they run on a GNMX wrapper. So yeah. All those "arguments" about how PCs are bad, how PC software is bad, how architectural differences are causing the performance deficit are rebutted, by this logic. I mean, if it "THEORETICALLY" runs on a wrapper, it is not even taking advantage of the said architecture. ;) It was the wrapper all along, people.

The burden of proof, of course, is not on me. I don't care; after all, I just make the claim. It fits my narrative (now). Unless you manage to make your way into the confidential porting process these devs have and ask them directly, you will never know. None of those games are native to the DX12/DX11 APIs, anyway.
 
I don't think personal attacks like this are appropriate. NXgamer could be a valuable contributor here, but if the personal attacks keep up, he won't bother posting here anymore. He has extensive knowledge, good coverage, a YouTube channel, and works for IGN. His contributions could be valuable, so let's try to treat him like we treat everyone else and not like some kind of enemy.

I'm not backseat modding here but those kinds of answers are what drive people away from here.
Ideally the community would be self moderating and not need Moderators to manage conversations. ;)

In this instance, I read Henry Swagger as being unaware NXgamer is present and posting on account of having a different user name and no reason to connect them. Michael Thompson could do with a .sig that mentions who they are.
 
Your reply is empty; you said nothing. Explain your point.

"So, if you look at that from a holistic view and not through a PoV of a Platform or defensive warrior, you will see the dots that I do not need to connect.....hopefully!"

You literally said you don't need to explain your point, and just said that I was a 'platform warrior' for asking 'what is the basis to even speculate on such a thing for a major shipping product for a console with a well-established development environment?' with regard to potential 'shader stuttering'.

And explain what comprises the functions of Proton; if you think it is all DXVK, you are incorrect. Also, elaborate on why the OS layer does not matter.

So what, the fork of Wine libraries being a part of Proton makes this more plausible?

What is actually being implied here? I mean, we have to keep trying to suss this out because you simply won't come out and say directly what your theory is. That Sony actually worked on their own fork of Wine, and also has the underpinnings in their OS to facilitate an emulation layer for Windows x64 binaries? That Sony developed a fully functional Vulkan driver for their custom BSD variant as well? The "OS layer" part of this is a fraction of what makes Proton possible, which is why we didn't see anything remotely this functional until Valve had the impetus to contribute their resources to get it into this state for the Steam Deck. It's born of years of focused development effort on their part, built on the back of decades of work on Wine and years of pressure on Nvidia/AMD to get their Vulkan drivers into a polished state.
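To put that layering in concrete terms, here's a minimal C++ sketch of the stack being described. Every name in it is a hypothetical stand-in rather than real Wine/DXVK/Vulkan code; the point is only that each layer below the game binary is its own large engineering effort, and the OS is just the bottom of it.

```cpp
// Purely illustrative layering sketch; all functions are hypothetical stand-ins.
#include <cstdio>

// Layer 3: the vendor's Vulkan driver (years of work to make it performant and conformant).
void vulkan_driver_execute_draw() { std::printf("GPU executes the draw\n"); }

// Layer 2: a D3D -> Vulkan translator in the spirit of DXVK / vkd3d-proton.
void translate_d3d_draw_to_vulkan(unsigned index_count) {
    // State translation, descriptor plumbing, barrier tracking, etc. would live here.
    (void)index_count;
    vulkan_driver_execute_draw();
}

// Layer 1: a Win32/D3D runtime reimplementation in the spirit of Wine, so the
// unmodified Windows binary's imports resolve to something on a non-Windows OS.
void wine_like_d3d11_draw_indexed(unsigned index_count) {
    translate_d3d_draw_to_vulkan(index_count);
}

// Layer 0: the unmodified Windows game, which believes it is talking to D3D11.
int main() {
    wine_like_d3d11_draw_indexed(36);
}
```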

Again, the claim isn't that Sony's OS is based on a Linux kernel, or that Sony also has higher- and lower-level APIs. Everyone knows this. The claim you seem to be making is that it's actually interpreting compiled Windows/DirectX code in real time for this game.

That is an extraordinary claim, and it requires extraordinary evidence.

My ATM took quite a while to give me my cash yesterday. It runs on Windows embedded, so I can only guess it was some sort of shader stutter.

The only embarrassing thing here is the style of your posts, which is certainly not helping the discussion.

You need to take a more holistic view of my posts, then connect the dots.
 
Expressions of incredulity aren't conducive to technical discussion. One of the downsides to this sort of forum is the delay between responses, meaning we tend to advance half the argument ourselves by debating what we think was said in the absence of correction/clarification/retraction. In direct discourse, one would simply ask for clarity (and only express incredulity when the clear, unequivocal response truly warrants it :p).

Michael presented a rather uncertain sentiment that many of us are scratching our heads over. Let's leave it until he replies with a clear explanation of what he thinks is happening instead of trying to guess and responding to those guesses.
 

Somewhat odd to go to the trouble of including 4 separate GPUs in your test (not counting the Steam Deck), but only vary the DLSS quality between them. A 3050 owner will not be running Ultra ray-traced reflections, so this doesn't really reflect (cough) the performance you'll be getting from the PC version. It's like benching a game using its supersampling setting set to max on a 1060.


For perspective, a 3080 at 4K native with High raster settings gets you 120-140 fps; 4K native with all raster settings maxed plus RT shadows/AO gets 75-90 fps. Halve that for RT reflections, then halve it again for Ultra RT reflections.
 
I think there's a little miscommunication here, because Michael is using some kind of stretchy analogies, but if I understand correctly he means a wrapper in the simplest (and most common) sense for a core system like rendering: a single interface the game code uses, with commands like "submit this thing to draw" and "get access to this sampler" or whatever, where that function then calls the platform-specific rendering API (and also does whatever setup, memory management, per-platform ordering of operations, etc., which could introduce overhead). This is, like he said, extremely common for rendering.
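To make that concrete, here's a minimal C++ sketch of that kind of wrapper. All names are made up for illustration, not any actual engine or SDK code: the game talks to one interface, and each platform backend forwards to its native API plus whatever per-platform bookkeeping it needs.

```cpp
// Minimal sketch of a thin rendering wrapper; all names are hypothetical.
struct DrawCall { unsigned index_count; unsigned first_index; };

// The single interface the game code sees, regardless of platform.
class IRenderBackend {
public:
    virtual ~IRenderBackend() = default;
    virtual void SubmitDraw(const DrawCall& dc) = 0;
    virtual void* GetSampler(unsigned slot) = 0;
};

// One implementation per platform; each forwards to the native API.
class D3D12Backend : public IRenderBackend {
public:
    void SubmitDraw(const DrawCall& dc) override {
        // Would record into a D3D12 command list here.
        (void)dc;
    }
    void* GetSampler(unsigned slot) override { (void)slot; return nullptr; }
};

class PS5Backend : public IRenderBackend {
public:
    void SubmitDraw(const DrawCall& dc) override {
        // Would translate to the PS5's native submission path here,
        // plus whatever per-platform setup the wrapper needs.
        (void)dc;
    }
    void* GetSampler(unsigned slot) override { (void)slot; return nullptr; }
};

// Game code only ever talks to the wrapper.
void RenderFrame(IRenderBackend& backend) {
    backend.SubmitDraw({36, 0});
}

int main() {
    D3D12Backend backend; // or PS5Backend, chosen per platform at build time
    RenderFrame(backend);
}
```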

What's uncommon, to the point of being almost impossible to believe, is this causing such a big performance issue. A person could certainly write a wrapper so bad that it was basically a 1:1 mapping of the DX12 feature set but terribly mismatched to the PS5 feature set, such that it ran really slow, but doing that would break everything.

It's not like the slow scenes are a little bit slower, as if there's a tiny but acceptable bit of added overhead they never put in the work to improve. And it's also not like the slow scenes have some new, otherwise unseen rendering features, as if those particular workflows are very different in the PS5 API and their wrapper doesn't successfully take the fast path.

I think this theory is a credible explanation for a lot of games' performance differences, especially the early cross-platform games that ran just a little bit less stably all the time on XSX. For this game, the theory doesn't match the results at all.


Also, I didn't see him say that the PS5 version had to recompile shaders, but if he did, that's definitely not how it works on fixed platforms like the PS5 or XSX.
 
Somewhat odd to go to the trouble of including 4 separate GPUs in your test (not counting the Steam Deck), but only vary the DLSS quality between them. A 3050 owner will not be running Ultra ray-traced reflections, so this doesn't really reflect (cough) the performance you'll be getting from the PC version. It's like benching a game using its supersampling setting set to max on a 1060.
This is what always happens. Every card gets tested at the same settings, so max settings regardless of whether it's practical or not.

This is why I'm not too keen on these performance-destroying Ultra options. It makes people without technical knowledge think that the game straight up cannot run well with RT on lower-end cards. And people like Steve from Hardware Unboxed parrot that thought without questioning it and then come to the conclusion that "everything below a 3080 is not suitable for RT" or that "the 2060 Super has no advantage over the 5700 XT because RT is unusable anyway." And of course, their community will then parrot that as well without a second thought.

It really annoys me.
 
Also -- I didn't see him say that the ps5 version had to recompile shaders, but if he did that's definitely not how it works on fixed platforms like the ps5 or xsx.

"They almost always present as shader compilation stutters on PS5, and this is a DX12 title. This could be a viable choice for the team to use a higher level of API abstraction that closely mirrors the DX API".

You could possibly interpret his meaning here as these stutters being 'like' shader stuttering on the PC, but it's a very odd way to describe stalls in general. It's already a problem that people describe any stall they experience in PC games as being solely due to shader stuttering, let alone on a fixed platform that has no reason to ship with uncompiled shaders, something console users have likely never experienced. That alone would be strange, but he then follows it up by explaining that the team may be using a 'higher level of API', which gives the impression he is indeed talking about the PS5 port needing to compile shaders as a result of this higher level of API abstraction.
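For reference on why this reads so oddly for a console, here's a hypothetical C++ sketch (not any real graphics API) of the difference: on PC the exact GPU and driver aren't known until runtime, so a pipeline for a new shader/state combination may only get compiled the first time it's drawn, stalling that frame, whereas a fixed platform can compile everything ahead of time and ship it with the game.

```cpp
// Hypothetical sketch of lazy vs. precompiled pipelines; not a real graphics API.
#include <chrono>
#include <cstdio>
#include <thread>
#include <unordered_map>

struct Pipeline { int id; };

// Stand-in for an expensive driver-side compile (tens to hundreds of milliseconds).
Pipeline CompilePipeline(int shader_state_hash) {
    std::this_thread::sleep_for(std::chrono::milliseconds(80));
    return Pipeline{shader_state_hash};
}

std::unordered_map<int, Pipeline> g_pipeline_cache;

// PC-style lazy path: the first draw with a new combination eats the compile cost mid-frame.
const Pipeline& GetPipelineForDraw(int shader_state_hash) {
    auto it = g_pipeline_cache.find(shader_state_hash);
    if (it == g_pipeline_cache.end()) {
        std::printf("compiling pipeline %d at draw time -> visible hitch\n", shader_state_hash);
        it = g_pipeline_cache.emplace(shader_state_hash, CompilePipeline(shader_state_hash)).first;
    }
    return it->second;
}

// Console-style path: everything needed was compiled ahead of time (offline or at load).
void PrecompileAll(const int* hashes, int count) {
    for (int i = 0; i < count; ++i)
        g_pipeline_cache.emplace(hashes[i], CompilePipeline(hashes[i]));
}

int main() {
    const int hashes[] = {1, 2, 3};
    PrecompileAll(hashes, 3); // no hitches later...
    GetPipelineForDraw(1);
    GetPipelineForDraw(4);    // ...unless a combination was missed
}
```

On PC the usual mitigation is precompiling during a loading screen or a shader pre-warming step, which is exactly what's missing when games stutter like this.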

I mean who knows, we're still reading tea leaves here to some extent. We could get a clearer explanation of what his intent was, but I guess that wouldn't be 'holistic'.

Also, earlier there's less of a "this could be a possibility of what's happening" and something far more definitive:

"I believe we have a mixture of data and GPU bound moments, and certainly API related".
 
This is what always happens. Every card gets tested at the same settings, so max settings regardless of whether it's practical or not.

Well, in this particular case they're not even doing that - they're adjusting the resolution as well as the DLSS quality for each tier of card. It's a very strange way to demonstrate scalability, adjusting only those but keeping things like max RT on.

This is why I'm not too keen on these performance-destroying Ultra options. It makes people without technical knowledge think that the game straight up cannot run well with RT on lower-end cards. And people like Steve from Hardware Unboxed parrot that thought without questioning it and then come to the conclusion that "everything below a 3080 is not suitable for RT" or that "the 2060 Super has no advantage over the 5700 XT because RT is unusable anyway." And of course, their community will then parrot that as well without a second thought.

It really annoys me.

Yeah, I've raised this concern before as well. It's great to have scalability in PC games; it's one of the key advantages of the platform, after all. I want Ultra options that can deliver a better experience when I get new hardware and revisit a game, without begging devs to provide an upgrade patch to take advantage of that new hardware.

But developers need to actually provide a meaningful visual benefit to warrant the performance cost. Just upping the precision for the sake of having an 'Ultra' setting that doesn't actually provide tangible visual differences other than halving your frame rate is ultimately just giving your game unnecessarily bad PR.

God of War is a good example. On my 3060 at 4k, I have to run with original settings, aside from textures/aniso at max, to maintain 60fps (mostly) at DLSS balanced. I'll likely upgrade my GPU in the coming year, which will allow me to run at Ultra settings. But here, they actually mean something - Ultra shadows produce a very tangible upgrade, to the point where they can significantly alter the lighting of entire scenes. They're costly, but you see why. The game doesn't look like shit without them, but it looks so much better with them. That's scalability. Ultra means something here other than just a word representing a numerical value.

I would defy anyone to tell the difference between RT reflections and Ultra RT reflections in Sackboy without very zoomed-in, side-by-side comparisons - and even then it's difficult. Again, give me those system-destroying options - but you have to actually make the advantages they bring visible. If you can't, then I see little point in including them. Like really, benchmarking Sackboy with 200% res scaling would actually make more sense from the perspective of using a setting that actually provided a visual benefit. :)

(Also, drawing attention to these meaningless Ultra settings draws attention away from other performance aspects of the game which actually affect everyone regardless of GPU, such as the shader compilation stutters, something that sites like Hardware Unboxed should have been talking about long before the 'problem' of the high cost of RT.)
 
I think this theory is a credible explanation for a lot of games' performance differences, especially the early cross-platform games that ran just a little bit less stably all the time on XSX. For this game, the theory doesn't match the results at all.
I think the only interesting fact is that the PS5 drops more fps in data-loading scenes and loading times are worse on PS5, so Asobo clearly didn't bother to use the PS5's I/O system correctly, but that of course doesn't mean they used some kind of wrapper.
 
I think there's a little miscommunication here, because Michael is using some kind of stretchy analogies, but if I understand correctly he means a wrapper in the simplest (and most common) sense for a core system like rendering: a single interface the game code uses, with commands like "submit this thing to draw" and "get access to this sampler" or whatever, where that function then calls the platform-specific rendering API (and also does whatever setup, memory management, per-platform ordering of operations, etc., which could introduce overhead). This is, like he said, extremely common for rendering.

What's uncommon, to the point of being almost impossible to believe, is this causing such a big performance issue. A person could certainly write a wrapper so bad that it was basically a 1:1 mapping of the DX12 feature set but terribly mismatched to the PS5 feature set, such that it ran really slow, but doing that would break everything.

It's not like the slow scenes are a little bit slower, as if there's a tiny but acceptable bit of added overhead they never put in the work to improve. And it's also not like the slow scenes have some new, otherwise unseen rendering features, as if those particular workflows are very different in the PS5 API and their wrapper doesn't successfully take the fast path.

I think this theory is a credible explanation for a lot of games' performance differences, especially the early cross-platform games that ran just a little bit less stably all the time on XSX. For this game, the theory doesn't match the results at all.


Also, I didn't see him say that the PS5 version had to recompile shaders, but if he did, that's definitely not how it works on fixed platforms like the PS5 or XSX.
It's Unreal Engine 4. Not sure why any sort of wrapper is needed. It deploys on all platforms.
 
A Plague Tale: Requiem is not Unreal, it's a proprietary engine. Unless I'm misreading something and you're talking about something else?
Yep, it's a common misconception with this game series and engine. It looks VERY Unreal Engine 4-like. I'm even guilty of this myself, as at first I also thought it was Unreal Engine, but it's Asobo's own proprietary engine.

The fact that people are easily confusing the two simply based on the type of visuals on display is quite the accomplishment for Asobo.
 