Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

If only they offered a way to play all their games for cheap after some time passes, like a $30 a year subscription service...
Is that really a good excuse for publishers to get even worse about terrible launches that need tons more patching? If that's the case, it's a bad thing.
 
Here we go again... But please, EA, explain to me why I should pay $70+ for an unfinished game that requires an additional three months or so of patching to get into a playable state...
EA? Pretty much every game these days has issues that take time to patch out. EA is hardly even one of the worst offenders in this regard lately.

Also, it's predictable seeing people get preemptively outraged based on one early anecdotal report. I swear, like 95% of gamers are complete reactionaries.
 
Is that really a good excuse for publishers to get even worse about terrible launches that need tons more patching? If that's the case, it's a bad thing.
Not an excuse.

It's a means of getting the most for your money. There's no need to play single-player games at launch. There are far too many great games out there to feel you have to jump in on a new title on Day One.
 
Not an excuse.

It's a means of getting the most for your money. There's no need to play single-player games at launch. There are far too many great games out there to feel you have to jump in on a new title on Day One.

Yup, not including MS releases on Game Pass, Elden Ring was the last AAA game I played at launch and before that ... hmmm, maybe Dark Souls 3? [edit] Oh wait, Cyberpunk 2077.

There are far far too many good AA and indie games for me to even think about most AAA games and dealing with any launch issues that they might have.

Warhammer 40k: Space Marine 2 is likely the next AAA that I'll play at launch, just because I love all things Warhammer and Warhammer 40k. And I'll be crossing my fingers that it's not in rough shape at launch.

Regards,
SB
 
So I made some frame captures comparing the Uncharted collection against TLOU. These games don't work in RenderDoc, so I used Nsight instead. Look in the upper right and you'll see the number of textures and buffers in each scene.

Uncharted LL = 7317 textures and 336 buffers
Uncharted LL = 6986 textures and 370 buffers
TLOU = 17385 textures and 879 buffers

In both cases, the largest textures (outside of render targets) are 2048x2048, but the vast majority are 512x512 and 1024x1024. The reason TLOU uses more VRAM doesn't seem to be higher-resolution textures and buffers, but simply that it has far more of them.
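As a rough sanity check on why the count matters more than individual resolutions, here's a back-of-the-envelope estimate. The mix of sizes and the BC7 compression assumption below are mine, not read out of the captures, so treat it as a sketch rather than measured data:

```cpp
#include <cstdio>

// Approximate footprint of a block-compressed texture with a full mip chain.
// BC7 is 1 byte per texel; the mip chain adds roughly a third on top.
double textureMiB(int width, int height, double bytesPerTexel = 1.0) {
    return width * height * bytesPerTexel * (4.0 / 3.0) / (1024.0 * 1024.0);
}

int main() {
    // Hypothetical mixes loosely matching the captured counts: mostly 512^2
    // and 1024^2 textures with a handful of 2048^2. The split is assumed.
    double uncharted = 4000 * textureMiB(512, 512)
                     + 3000 * textureMiB(1024, 1024)
                     +  300 * textureMiB(2048, 2048);
    double tlou      = 9000 * textureMiB(512, 512)
                     + 8000 * textureMiB(1024, 1024)
                     +  400 * textureMiB(2048, 2048);

    std::printf("Uncharted-like mix (~7300 textures): ~%.1f GiB\n", uncharted / 1024.0);
    std::printf("TLOU-like mix (~17400 textures):     ~%.1f GiB\n", tlou / 1024.0);
    // Doubling the number of textures at the same resolutions roughly
    // doubles the resident footprint, no 4K textures required.
}
```

With the assumed mix, sheer count alone puts the TLOU-like case into double-digit GiB territory without a single 4K texture.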
 
So I made some frame captures comparing the Uncharted collection against TLOU. These games don't work in RenderDoc, so I used Nsight instead. Look in the upper right and you'll see the number of textures and buffers in each scene.

Uncharted LL = 7317 textures and 336 buffers
Uncharted LL = 6986 textures and 370 buffers
TLOU = 17385 textures and 879 buffers

In both cases, the largest textures (outside of render targets) are 2048x2048, but the vast majority are 512x512 and 1024x1024. The reason TLOU uses more VRAM doesn't seem to be higher-resolution textures and buffers, but simply that it has far more of them.
Nice work. And no surprises there. This is what others and I have been saying about this port. Most games use a relatively low amount of VRAM because they obviously reuse many of their textures, whereas the vast majority of textures in TLOU are unique.

On PS5 this is not a big problem, as those textures can be quickly and painlessly streamed by the system directly into VRAM, ready to use. The process is obviously very different on current PCs, where it takes a lot of CPU and bandwidth resources, or you need a large amount of VRAM in order to store the textures for the whole scene.
 
Nice work. And no surprises there. This is what others and I have been saying about this port. Most games use a relatively low amount of VRAM because they obviously reuse many of their textures, whereas the vast majority of textures in TLOU are unique.

On PS5 this is not a big problem, as those textures can be quickly and painlessly streamed by the system directly into VRAM, ready to use. The process is obviously very different on current PCs, where it takes a lot of CPU and bandwidth resources, or you need a large amount of VRAM in order to store the textures for the whole scene.
I agree, I don't think there's ever been anything particularly wrong with it. It's mainly just VRAM-intensive as a result of being a PS5 port, and slightly CPU-intensive from the lack of a dedicated decompression path.

This is still obviously just preliminary. But I'm not sure how you would "optimize" this game any further without simply removing textures. They aren't that high-res to begin with, and lowering their resolution would probably always show.
 
So I made some frame captures comparing the Uncharted collection against TLOU. These games don't work in RenderDoc, so I used Nsight instead. Look in the upper right and you'll see the number of textures and buffers in each scene.

Uncharted LL = 7317 textures and 336 buffers
Uncharted LL = 6986 textures and 370 buffers
TLOU = 17385 textures and 879 buffers

In both cases, the largest textures (outside of render targets) are 2048x2048, but the vast majority are 512x512 and 1024x1024. The reason TLOU uses more VRAM doesn't seem to be higher-resolution textures and buffers, but simply that it has far more of them.
What's the approximate size of the streaming pool, if that kind of information is available?
 
What's the approximate size of the streaming pool, if that kind of information is available?
I honestly don't know. These captures don't really provide that information since they just report what's in the visible frame. Backing stores containing unused textures don't show up as far as I know.

I'd have to check whether there's a difference between the Ultra and High texture settings, but I never noticed a visible difference between them when I was playing the game. My bet would be that the setting mainly controls the size of the texture pool rather than changing the actual size of the textures, but I'll need to check.
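If that guess is right, the setting would behave something like the sketch below: the preset picks a streaming-pool budget while the source assets stay identical. The enum, function, and budget numbers are hypothetical, not taken from the game:

```cpp
#include <cstddef>

// Hypothetical mapping from the texture-quality preset to a streaming-pool
// budget. Neither preset changes the resolution of the source textures.
enum class TextureQuality { High, Ultra };

std::size_t streamingPoolBudgetMiB(TextureQuality quality) {
    switch (quality) {
        case TextureQuality::High:  return 6 * 1024;  // smaller pool, more eviction and streaming churn
        case TextureQuality::Ultra: return 9 * 1024;  // larger pool, fewer chances of visible mip pop-in
    }
    return 6 * 1024;  // unreachable fallback to keep the compiler happy
}
```

Under a scheme like that, you'd only notice a difference when the smaller pool forces lower mips to stay resident a moment longer, which would fit never spotting it in normal play.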
 
Getting annoyed with this stuff tbh... system requirements for games suggesting DLSS... and not only that... but DLSS "performance" mode...

[attached image: system requirements chart]



Devs are clearly just using DLSS/FSR as a crutch now to release unoptimized ports... and it's going to get even worse when FSR3 releases and then they start recommending that...
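For context on what that recommendation actually implies: the standard DLSS 2.x presets render internally at a fixed fraction of the output resolution (Quality ~66.7%, Balanced 58%, Performance 50%, Ultra Performance ~33.3% per axis). A quick sketch of the arithmetic at a 4K output, which is just an example target rather than something taken from the chart:

```cpp
#include <cmath>
#include <cstdio>

// Print the internal render resolution implied by a DLSS per-axis scale factor.
void printInternalRes(const char* mode, double scale, int outW, int outH) {
    int w = static_cast<int>(std::lround(outW * scale));
    int h = static_cast<int>(std::lround(outH * scale));
    std::printf("%-17s -> %dx%d internal (%.0f%% of the output pixels)\n",
                mode, w, h, scale * scale * 100.0);
}

int main() {
    // Standard DLSS 2.x per-axis scales, shown for a 3840x2160 output.
    printInternalRes("Quality",           2.0 / 3.0, 3840, 2160);
    printInternalRes("Balanced",          0.58,      3840, 2160);
    printInternalRes("Performance",       0.5,       3840, 2160);  // a 1080p render presented at 4K
    printInternalRes("Ultra Performance", 1.0 / 3.0, 3840, 2160);
}
```

So a requirements sheet that assumes Performance mode is really quoting hardware for a quarter of the output pixels.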
 
Getting annoyed with this stuff tbh... system requirements for games suggesting DLSS... and not only that... but DLSS "performance" mode...

[attached image: system requirements chart]



Devs are clearly just using DLSS/FSR as a crutch now to release unoptimized ports... and it's going to get even worse when FSR3 releases and then they start recommending that...

I would take a look at the video clips for this game, though. It may be LOTR-licensed, but otherwise it really doesn't look like it's going to be remembered a few months after release.

It's releasing on PS4 and Switch too. I just wouldn't draw any conclusions from this about future trends in optimization. Sure, we have enough worrying signs already, but this really just looks like a development studio out of its depth, or a title only receiving any attention at all because of the licensing.

[attached screenshot]
I mean, come on.
 
Getting annoyed with this stuff tbh... system requirements for games suggesting DLSS... and not only that... but DLSS "performance" mode...

[attached image: system requirements chart]



Devs are clearly just using DLSS/FSR as a crutch now to release unoptimized ports... and it's going to get even worse when FSR3 releases and then they start recommending that...

Ha, too late. Did you see the DLSS3 footnote?
 
Is TLOU already doing any sort of texture streaming on PC? Otherwise SFS won’t help.

That would mean implementing streaming too. I'm not saying it would be simple, just pointing out what could maybe bring improvements.
 
That would mean implementing streaming too. I'm not saying it would be simple, just pointing out what could maybe bring improvements.

Ok but if they aren’t streaming at all then implementing streaming even without SFS probably helps a ton. SFS is just icing on the cake.
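Even the crude, feedback-free version of streaming, picking mip levels from camera distance and only keeping those resident, avoids pinning every texture at full resolution. A minimal sketch of that heuristic; the function name, the 4 m threshold, and the one-mip-per-doubling rule are made up for illustration:

```cpp
#include <algorithm>
#include <cmath>

// Distance-based mip selection: no sampler feedback, just a heuristic.
// Returns the sharpest mip level worth keeping resident for an object at
// 'distance' metres, given the texture's total mip count.
int desiredTopMip(float distance, int mipCount) {
    // Assume full detail is only needed up to ~4 m, then drop one mip level
    // per doubling of distance. The thresholds are illustrative only.
    float dropped = std::log2(std::max(distance, 4.0f) / 4.0f);
    return std::min(static_cast<int>(dropped), mipCount - 1);
}

// A streamer built on this would keep mips [desiredTopMip(d, n) .. n-1]
// resident and evict anything sharper, so distant objects never hold their
// 2048x2048 mip 0 in VRAM.
```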
 
Devs are clearly just using DLSS/FSR as a crutch now to release unoptimized ports... and it's going to get even worse when FSR3 releases and then they start recommending that..
People have made claims like this for years. Every time a new GPU generation comes out, I hear at least a handful of people claiming that the extra power will just mean developers stop optimizing their games.

It's never made any sense. And this incessant insistence by gamers that developers are all just lazy and don't care about optimization is getting out of hand.

Beyond that, remember that these 'requirements' are never accurate. Stop taking them so seriously. They are rarely more than rough guesses.

Lastly, the idea of using reconstruction as the default *was* always going to happen at some point. I've been saying this for years now, and I'm shocked how many people don't grasp it. These reconstruction techniques have gotten so good that it's more often than not wasteful to use native resolution instead, when you can get a near-enough result with much better performance. Any downsides will typically be seen as acceptable in order to gain that performance headroom.

So yes, ANYTHING that grants more performance headroom can theoretically allow developers to slack off, but there's still always a strong incentive to optimize as much as reasonably possible. This has always been the case and always will be.
 
Perhaps implement SFS?

This was my thought too. Streaming with SFS would obviously reduce the VRAM requirement, and it wouldn't necessarily even require a fast SSD if you used system RAM as the streaming source (pre-loading what's needed for the level there first and streaming from disk at a slower pace to keep that pool updated).

And the CPU overhead from decompressing those streamed textures could then be handled by DirectStorage 1.1.

I know that implementing these systems in what is a straight port of a game built around the console architecture would likely have been more work than it's worth, but the key point is that there are solutions to these problems on PC, and this isn't a straightforward limitation of the PC architecture when dealing with a console-centric game. It's more a case of an optimal implementation versus a suboptimal port of the original implementation.
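To make the shape of that concrete, here's a rough sketch of the two-tier idea: compressed tiles sit in a system-RAM backing store and get promoted into a fixed VRAM budget on demand, with least-recently-used eviction when the pool is full. Everything below (the class, names, and bookkeeping) is hypothetical; a real implementation would sit on top of D3D12 reserved resources, sampler feedback, and DirectStorage for the actual copies and decompression:

```cpp
#include <cstdint>
#include <list>
#include <string>
#include <unordered_map>

// Two-tier streaming pool sketch: compressed tiles live in system RAM and are
// promoted into a fixed VRAM budget on demand, with LRU eviction to make room.
class TexturePool {
public:
    explicit TexturePool(std::uint64_t vramBudgetBytes) : budget_(vramBudgetBytes) {}

    // Called when sampler feedback (or any other signal) says a tile is needed.
    void request(const std::string& tileId, std::uint64_t sizeBytes) {
        if (auto it = resident_.find(tileId); it != resident_.end()) {
            lru_.splice(lru_.begin(), lru_, it->second.lruIt);  // touch: move to front
            return;
        }
        while (used_ + sizeBytes > budget_ && !lru_.empty())
            evict(lru_.back());  // oldest tile first
        // Real path: a DirectStorage RAM->VRAM request with GPU (GDeflate)
        // decompression would happen here instead of plain bookkeeping.
        lru_.push_front(tileId);
        resident_[tileId] = Entry{sizeBytes, lru_.begin()};
        used_ += sizeBytes;
    }

    std::uint64_t residentBytes() const { return used_; }

private:
    struct Entry { std::uint64_t size; std::list<std::string>::iterator lruIt; };

    void evict(std::string tileId) {  // by value: the referenced list node is about to go away
        auto it = resident_.find(tileId);
        used_ -= it->second.size;
        lru_.erase(it->second.lruIt);
        resident_.erase(it);
    }

    std::uint64_t budget_ = 0;
    std::uint64_t used_ = 0;
    std::list<std::string> lru_;                      // front = most recently used
    std::unordered_map<std::string, Entry> resident_;
};
```

The CPU cost the current port pays for decompression would then move onto the GPU copy path, which is exactly the part DirectStorage 1.1's GDeflate route is meant to handle.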
 



Fiddling with graphics options didn't help much, either. There's no native option to limit the framerate, but the Nvidia control panel did the trick. The only upscaling option is FSR 2.0, which usually failed to improve my fps but always succeeded in making Cal's face blurry and unintelligible in motion. I've never missed DLSS more.

For what it's worth, there's a pre-release patch coming a few days before launch, and among the patch notes EA shared with press is "performance improvements across all platforms." Hopefully it'll help, but I'd be surprised if all of those framerate drops disappeared overnight. We are living in a time of bad PC ports, after all.

 