Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

I'm not necessarily saying I'm getting a flawless "no stutter at all" experience. Sorry for the poor choice of words. I'd say it's smooth 98% of the time. There are hitches here and there when a new map is loaded or a new car shows up in the scene, but it is definitely not constant enough to call it problematic or immersion-breaking. They definitely do not happen on higher-end systems with more memory. From your example it even happens with medium textures, but on my end it happens even more rarely with high textures. So it could be that my preventive measures of disabling Steam's hardware acceleration and restarting dwm.exe (a one-liner for the latter is sketched below) play a part in getting a somewhat smoother experience.

In my case, a frame cap also eliminates the problems in a major way, a 50 FPS cap especially. Giving the GPU some headroom seems to have a positive effect.
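For reference, here is a minimal sketch of the dwm.exe restart mentioned above, assuming Windows and an elevated (admin) prompt; Windows respawns the Desktop Window Manager on its own after the process is killed, while the Steam hardware acceleration change is just a toggle in the Steam client settings:

```python
# Minimal sketch (Windows-only, run elevated): force dwm.exe to restart.
# Windows automatically relaunches the Desktop Window Manager after it is killed.
import subprocess

subprocess.run(["taskkill", "/F", "/IM", "dwm.exe"], check=False)
```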
You're limiting your framerate with RTSS. The first thing I noticed when trying this out was that RTSS does not report frametimes properly with this game when you're capping the framerate. I did the same thing, and my line is perfectly straight like yours, but there's still hitching and stuttering... it's not smooth at all. The same thing happens even when I limit it to 40fps (which I NEVER drop under), or hell, even 30fps. At a 30fps limit, there's STILL noticeable stutter in the way the game paces frames.

It's definitely problematic...

...because this game is EXTREMELY INCONSISTENT in every single way. You can be in a room just spinning the camera around and there will be uneven framepacing at completely random intervals... and then you can be running through a much bigger environment, looting and shooting, and frametimes can be smooth... and then it changes again for no apparent reason.

There are serious issues going on under the hood here which completely ruin the framepacing in this game... and these things MAY be less noticeable/frequent at the extremely high end, but they are 100% there.
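For what it's worth, one way to sanity-check frame pacing independently of RTSS's overlay is to log presents with a capture tool and inspect the raw frametimes. A minimal sketch, assuming a PresentMon-style CSV with a frametime column (called "msBetweenPresents" here; the exact column name depends on the tool and version):

```python
# Minimal sketch: flag frame-pacing spikes in a frametime log (e.g. a PresentMon CSV).
# Assumes a column named "msBetweenPresents"; adjust to whatever your capture tool writes.
import csv
import statistics

def pacing_report(path: str, column: str = "msBetweenPresents") -> None:
    with open(path, newline="") as f:
        times = [float(row[column]) for row in csv.DictReader(f)]
    median = statistics.median(times)
    spikes = [t for t in times if t > 2 * median]  # frames that took over twice the median
    print(f"frames: {len(times)}, median: {median:.2f} ms, "
          f"spikes >2x median: {len(spikes)} ({100 * len(spikes) / len(times):.2f}%)")

# pacing_report("capture.csv")  # hypothetical log file
```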
 
I'm not limiting my framerate with RTSS, I'm using NV's frame limiter. But it's not hitching and stuttering on my end; the video is there too. I cannot feel or see them (aside from the obvious ones that occur). Mind you, I play with a gamepad, and this game has problems with mouse input; just giving some input. Another note I have to make is that I specifically use a DualSense, wired. I've also heard of Xbox gamepads having trouble in this game. It could be something to do with input.

If it weren't for the video's sake, I'd actually turn off Afterburner too. I gave up on the RTSS frame limiter a long time ago; it is quite incompatible with how crap my Zen+ CPU is. The NV frame limiter is more forgiving. Most games will outright spit out stutters at me if I use RTSS.

I hope this stuff gets fixed in general, though. I agree the game is problematic. But I have to say the reason the game is being review-bombed on Steam is because it crashes a lot (which also doesn't happen for me, for some reason). I have no idea why I'm being graced. :D
 
Another thing wildly outside the norm in this game is the size of the shader cache folder (called psolibs). On NVIDIA GPUs the size of that folder is about 12GB! On AMD GPUs it's only about 150MB.
Are there any other games not putting the shader cache in the default directory? The default one in %appdata%\local\nvidia\dxcache on my machine is currently 5.5GB, and I'm playing about 4 different games at the moment (Sons of the Forest / Atomic Heart / Wo Long / Elden Ring). It seems bizarre that this game's shader cache is double the size of 4 other games combined, and it could well be more than 4: The Finals and some other demos have been installed, played and uninstalled since my last driver update as well. I have the driver's cache size set to unlimited and don't manually delete the folder.
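If anyone wants to compare cache sizes on their own machine, summing the folders on disk is enough. A minimal sketch; both paths below are examples and will differ per install and driver:

```python
# Minimal sketch: report the on-disk size of shader cache folders.
# Both paths are examples; the driver cache location and the game's psolibs folder vary per setup.
import os
from pathlib import Path

def folder_size_gb(path: str) -> float:
    return sum(f.stat().st_size for f in Path(path).rglob("*") if f.is_file()) / (1024 ** 3)

for cache in (
    os.path.expandvars(r"%LOCALAPPDATA%\NVIDIA\DXCache"),  # NVIDIA driver DX shader cache
    r"C:\Games\TLOU\psolibs",                              # hypothetical game PSO cache path
):
    if Path(cache).is_dir():
        print(f"{cache}: {folder_size_gb(cache):.2f} GB")
```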
 
The Last of Us port looks like dogwater, unfortunately. It's almost as if, when you set the game to medium textures, it loads the textures but then renders the wrong mip level.
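To illustrate what "rendering the wrong mip level" would mean in practice: mip selection is normally driven by the on-screen texel footprint, and a clamp or bias forces a blurrier mip even though the full-resolution data is resident. A rough, generic sketch of the idea, not this game's actual streaming code:

```python
# Rough illustration of mip selection: a min-mip clamp (or large LOD bias) forces a
# lower-resolution mip even when higher mips are loaded. Generic idea, not the game's code.
import math

def chosen_mip(texels_per_pixel: float, num_mips: int,
               lod_bias: float = 0.0, min_mip_clamp: int = 0) -> int:
    lod = max(0.0, math.log2(max(texels_per_pixel, 1e-6)) + lod_bias)
    return min(num_mips - 1, max(min_mip_clamp, int(lod)))

print(chosen_mip(1.0, num_mips=12))                   # 1:1 texel/pixel -> mip 0 (full res)
print(chosen_mip(1.0, num_mips=12, min_mip_clamp=2))  # clamped -> mip 2 (1/4 x 1/4 resolution)
```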
 
It's almost as if Sony were incentivised to support and prioritise development of the PS5 version of the game, such as by using its best engineers and more of its resources.
 

I...don't get the point of your response. In other words, Sony didn't allocate the resources needed to properly port this game to the target platform. This much has been established, yes.
 
They rushed to capitalize on the success of the show. But yeah, that's porting for you.

It's a two-way street, by the way. If a game is designed heavily around consoles and brought back to PC without enough time, it's pretty shit on PC.

If a game is entirely designed with PC in mind and ported to console without enough time, it's going to be shit on console.

Look at the Baldur's Gate 3 controversy.
 
Cyberpunk 2077 is a great example of this.

But most games are developed around consoles these days. 🤷‍♂️
 
Sorry, I was making a joke using the words Sony used to describe a hypothetical situation in which Microsoft could make PlayStation look inferior by releasing a problematic port on PS5, even without any intended malice, simply by not allocating enough resources. And then, less than a month later, Sony releases this.
 

Ahhh I gotcha. :)
 
I think that platform holders have levels of influence, in this order:
First party
Some type of exclusivity
Marketing

They should have an element of quality control over it.

Just talking about texture quality at lower settings, recently we've seen that even on XSS, studios aren't bothering to take the time to do it. Which is also necessary for PC.

It looks like they just dropped the 2 highest-quality texture LOD levels and called it a day (a rough memory calculation is sketched after this post). It would even look bad on last gen.
Not a bug, not an issue with the streaming code, just not doing the work (for whatever reason).

We need AI downsampling tools to generate textures at different memory budgets/settings, then scan playthroughs during QA to optimize textures further based on visibility.
Obviously it can't be left up to people to do it any more.

8GB GPUs and even lower are still being sold, including in laptops; I don't know what the breakdown of ownership is, but it's obviously the biggest chunk of the market.
So I find it pretty disrespectful to the consumer, as it's not even a coding issue per se.
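As a back-of-the-envelope illustration of what dropping top mip levels buys you, here is the memory for a single 4096x4096 BC7 texture (1 byte per texel) with the top N mips removed; the numbers are generic, not taken from any particular game:

```python
# Back-of-the-envelope: memory for one 4096x4096 BC7 texture (1 byte per texel)
# when the top N mip levels are dropped. Generic numbers, not from any specific game.
def mip_chain_bytes(top_size: int, dropped_top_mips: int = 0, bytes_per_texel: float = 1.0) -> float:
    size = top_size >> dropped_top_mips  # halve resolution once per dropped mip
    total = 0.0
    while size >= 1:
        total += size * size * bytes_per_texel
        size //= 2
    return total

for dropped in (0, 1, 2):
    print(f"top {dropped} mip(s) dropped: {mip_chain_bytes(4096, dropped) / 1024 ** 2:.1f} MB")
# Dropping the 2 highest mips cuts memory roughly 16x, but the texture now tops out at 1024x1024.
```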
 
I think the thing that's telling in this port is that an 8GB card on PC will almost always have better texture quality than Series S. I can't think of a single example of texture quality being an issue on even an older 8GB card when trying to match Series S performance. I haven't used it in a little while, but I have an older computer with an 8GB R9 390 in it, and when tweaking for performance, texture quality is one of the settings that affects it the least in most games. Obviously there are limits, but at 1080p or lower I've never had an issue matching or exceeding Series S texture quality. Not counting TLOU, since I don't own it and don't plan on buying it.
 
I hadn't seen the duplicate discussion going on in the other thread, so I'll be careful not to derail this one.
XSS and 8GB GPUs should be able to look pretty similar texture-wise.

When we're seeing what we are on either, it's not because they've hit their technical limits; the problem is the authoring pipeline, QA, etc.

It's far from what you would expect to see from a medium/high quality setting; it's more like what you might expect from low/very low.
 

Not really related to any game, but I cannot find a thread about upscaling technology comparisons. Very interesting video that shows the weaknesses and strengths of different upscaling techs.
Surprised to see that DLSS and FSR2 fall apart so badly in motion; XeSS looks really, really good.
It may depend on which DLSS version is in use. I have talked about it before on the channel (for example in the Quake and Half-Life path tracing videos), but the standard SDK version of DLSS 3.1.1's default mode is incredibly ghosty; honestly, I think it is broken.

There are apparently sub-modes (presets) for DLSS 3.x, from "a" to "f" or something... and the default one - let us call it d - is just awful.
 

That's actually a really clever stress test, and I'm also shocked to see how poorly DLSS fared compared to XeSS.
 

Alex, is it worth playing around with DLSS versions to get a better image when using DLSS performance mode at 4K?

I'm happy with the image I'm getting, but is there much improvement in image quality from changing DLSS versions?
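If you do want to experiment with swapping DLSS versions, it helps to first check which DLL the game actually ships. A minimal sketch, Windows-only; the game path is hypothetical, and nvngx_dlss.dll is the usual filename for the DLSS library bundled with a game:

```python
# Minimal sketch (Windows-only): find the DLSS DLL a game ships and print its file version.
# The game directory below is a placeholder; nvngx_dlss.dll is the usual DLSS library name.
import subprocess
from pathlib import Path

def dlss_version(game_dir: str):
    for dll in Path(game_dir).rglob("nvngx_dlss.dll"):
        result = subprocess.run(
            ["powershell", "-NoProfile", "-Command",
             f"(Get-Item '{dll}').VersionInfo.FileVersion"],
            capture_output=True, text=True)
        return dll, result.stdout.strip()
    return None, None

print(dlss_version(r"C:\Games\SomeGame"))  # hypothetical install path
```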
 