Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

There has been one. Can you guess which? Uncharted 4. The 2080 Ti is only a smidge faster than the PS5 overall in that game despite being consistently 40-50% faster in other games.
IIRC Death Stranding also performs very closely to a 3060 Ti. Seems like PS-centric games simply like the hardware. Maybe they exploit a specific strength of the PS5 that shifts the balance in its favour. Just a guess, IMO.

I have no qualms with the PS5 matching my 3070 or 3060 Ti.
 
IIRC Death Stranding also performs very closely to a 3060 Ti. Seems like PS-centric games simply like the hardware. Maybe they exploit a specific strength of the PS5 that shifts the balance in its favour. Just a guess, IMO.

I have no qualms with the PS5 matching my 3070 or 3060 Ti.
A 3060 Ti is fine but barring a VRAM limitation, the PS5 has no business matching a 3070.
 
The game scales well with high core counts (observable in the charts). It even seems to scale from 6 cores to 8 cores. My 2700 at 3.7 GHz was pushing mostly around 55-70 FPS in most scenes so far, and with VRR it is very smooth. The GPU is the problematic side on my end. Nothing to write home about, but I'm near 3600X performance from the looks of it. I'd have to test their scenes specifically though.

(Sorry for the blurry 1080p pictures; I had to drop to 1080p to become CPU bound instead of GPU bound on my 3070 XD. The game looks insanely blurry at 1080p.)

[Screenshot: VpbUULv.png]

[Screenshot: 4qrWCQ8.png]

[Screenshot: TNKwx4z.png]



(I dunno where they tested it on the gamegpu website. Would like to know the location.)

It's one of the first games I've seen use all 16 threads of my CPU. Thankfully VRR/frame cap saves the day with the frametime instabilities. I'd need a better CPU to go fully unlocked, but I'm happy with what I've got.

Edit: Okay, they seem to test the last pic's location, as I suspected. So my 3.7 GHz Zen+ CPU most likely performs near a 4 GHz 3600, and 4 GHz would put me near a 3600X (58 FPS to 63 FPS). I'd say that location may not be representative of whole-game performance; later sections are calmer and more corridor-like. Most likely, if you get 50-60 FPS there, you will get 70+ FPS in the majority of the game. So I don't think CPUs will be much of a problem here.
 
I am really impressed by how much of a CPU this game can utilise, but I don't really understand why, and what it is doing to warrant that CPU utilisation and performance.
Like this alleyway: why is this alleyway doing this to a Ryzen 5 3600? What processing is occurring to render this little corner?
[Screenshot: first_play.00_24_39_2qbd14.png]


@yamaci17 what is your in-game VRAM meter saying about OS reservation? 1600 MB?
 
This is a stupid question and I will look stupid asking it, but here goes. I thought the 4090 was 8 times the general GPU power of the PS5, not 4 times? I know flops are not a real measurement and all, but considering how much power that component requires, among other things, I could have sworn it was much stronger than just 4x.
Only in floating point math, which accounts for a small portion of the total rendering time of a typical frame. From what users have posted with Nvidia's profiling tools, I haven't seen a frame where it was able to reach full utilization.
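As a rough sanity check, the paper-spec FP32 numbers do land around that 8x figure. Here's a back-of-the-envelope calculation using the commonly quoted shader counts and boost clocks (treat the figures as approximate):

```cpp
#include <cstdio>

int main() {
    // Theoretical FP32 throughput: 2 FLOPs (one FMA) per shader ALU per clock.
    // Shader counts and boost clocks are the commonly quoted spec-sheet values.
    double rtx4090_tflops = 2.0 * 16384.0 * 2.52e9 / 1e12; // ~82.6 TFLOPS
    double ps5_tflops     = 2.0 * 2304.0  * 2.23e9 / 1e12; // ~10.3 TFLOPS
    std::printf("RTX 4090: %.1f TFLOPS, PS5: %.1f TFLOPS, ratio: %.1fx\n",
                rtx4090_tflops, ps5_tflops, rtx4090_tflops / ps5_tflops); // ~8x on paper
    return 0;
}
```

So the 8x only describes peak floating point throughput, not the rest of the frame.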
 
I am really impressed by how much of a CPU this game can utilise, but I don't really understand why, and what it is doing to warrant that CPU utilisation and performance.
Like this alleyway: why is this alleyway doing this to a Ryzen 5 3600? What processing is occurring to render this little corner?
[Screenshot: first_play.00_24_39_2qbd14.png]


@yamaci17 what is your in-game VRAM meter saying about OS reservation? 1600 MB?
Yes, it says 1600 MB for me as well. But the game will happily use nearly 7500 MB if I have free unused VRAM space (I've managed to push dedicated VRAM usage to 7600 MB on one occasion). I have to take back what I said on that matter; this game definitely uses more VRAM than Spider-Man or most other recent titles, so there are no problems on that front. That meter is still weird though. Also, the game utilizes a huge amount of shared memory, similar to Spider-Man, but it doesn't impact performance the way I would expect. The game definitely does not limit itself to the "game application" value, for which I'm grateful. (However, I confirmed that OS+Background is a fixed 20% of your total VRAM. I've seen my friend's 4090 reporting 4.8 GB for OS+background whereas only... 700 MB was actively used in total. It is clearly a fixed 20%.)
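As a quick check of that 20% observation (the card sizes below are just illustrative):

```cpp
#include <cstdio>

int main() {
    // If "OS + background" is a fixed 20% of total VRAM, the reported
    // reservation should track card size. Card sizes here are illustrative.
    const double cards_gb[] = {8.0, 10.0, 12.0, 24.0};
    for (double gb : cards_gb)
        std::printf("%4.0f GB card -> %.1f GB reserved (20%%)\n", gb, gb * 0.20);
    // 8 GB -> 1.6 GB matches the 1600 MB meter; 24 GB -> 4.8 GB matches the 4090 report.
    return 0;
}
```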

What is weird with CPU usage is that even if you set everything to low, it practically changes nothing. Ultra and low seem to have similar CPU-bound performance, despite a lot of settings claiming to have a minor/moderate impact on CPU performance. I hope they push some settings that scale CPU performance somehow.

[Screenshot: 1680101407002.png]

This game is crazy about shared memory usage. It will chomp a lot of it even if you have free VRAM. Weird stuff. Could be why there are stutters here and there though. Maybe the game swaps to normal memory all the time? But I've managed to quell these stutters by limiting the framerate. Maybe my PCIe 3.0 link is being overrun by a lot of data transfer and capping the framerate eases the load there? I don't have a clue. So far I've found solace with the in-game 50 FPS lock and I'm enjoying the game.
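For what it's worth, the per-frame transfer budget over a PCIe 3.0 x16 link does grow as the frame cap drops, which would at least be consistent with that guess; ballpark numbers only:

```cpp
#include <cstdio>

int main() {
    // Approximate one-way bandwidth of a PCIe 3.0 x16 link, in MB/s.
    const double pcie3_x16_mbps = 15.75 * 1024.0;
    // Lower frame caps leave more bus time, and therefore a bigger per-frame transfer budget.
    const int caps_fps[] = {30, 50, 60, 120};
    for (int fps : caps_fps)
        std::printf("%3d FPS cap -> ~%4.0f MB of PCIe transfer budget per frame\n",
                    fps, pcie3_x16_mbps / fps);
    return 0;
}
```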
 
I am really impressed by how much of a CPU this game can utilise, but I don't really understand why, and what it is doing to warrant that CPU utilisation and performance.
Like this alleyway: why is this alleyway doing this to a Ryzen 5 3600? What processing is occurring to render this little corner?
[Screenshot: first_play.00_24_39_2qbd14.png]


@yamaci17 what is your in-game VRAM meter saying about OS reservation? 1600 MB?
I think it's from an earlier GDC presentation: they use atomic operations to build work queues so that all cores are fully utilized; threads are constantly pulling work and putting more work back onto the stack.

But I understand that utilization is not the same as work being done; it just means the core is busy. Wattage may be a better indicator of work being done when CPU utilization sits at 100%.
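Something like this minimal sketch of the pattern, assuming nothing about the game's actual job system (the task count and task body are placeholders): worker threads pull indices from a shared atomic counter, so every core stays busy until the queue drains.

```cpp
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

// Shared work queue expressed as an atomic cursor over a range of task indices.
// Each worker grabs the next index with fetch_add, so no core waits on a lock
// and none goes idle while tasks remain.
constexpr int kTaskCount = 1000;
std::atomic<int> next_task{0};
std::atomic<long long> checksum{0};

void do_task(int i) {
    // Placeholder workload; a real job could also enqueue follow-up tasks here.
    checksum.fetch_add(i, std::memory_order_relaxed);
}

void worker() {
    for (;;) {
        int i = next_task.fetch_add(1, std::memory_order_relaxed);
        if (i >= kTaskCount) break; // queue drained
        do_task(i);
    }
}

int main() {
    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 4; // fallback if the runtime can't report core count
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < n; ++t) pool.emplace_back(worker);
    for (auto& th : pool) th.join();
    std::printf("completed %d tasks on %u threads (checksum %lld)\n",
                kTaskCount, n, checksum.load());
    return 0;
}
```

Every hardware thread reports near-100% utilization while this runs, even though the work per task is tiny, which is the distinction made above between utilization and useful work.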
 
I am really impressed by how much of a CPU this game can utilise, but I don't really understand why, and what it is doing to warrant that CPU utilisation and performance.
Like this alleyway: why is this alleyway doing this to a Ryzen 5 3600? What processing is occurring to render this little corner?
[Screenshot: first_play.00_24_39_2qbd14.png]


@yamaci17 what is your in-game VRAM meter saying about OS reservation? 1600 MB?

Here's my overclocked i5 12400F in the same spot on Ultra settings (and still CPU limited):

[Screenshot: Alley.png]
 
The game is probably compiling shaders in the background as you play despite having waited for the compilation to finish at the beginning.
 
The game is probably compiling shaders in the background as you play despite having waited for the compilation to finish at the beginning.
I was still getting that much usage more than 1h into the game though.

I did find the gamegpu results I saw yesterday interesting. On the same settings, at 1080p and 1440p, the 3070 was using 22 to 24 GB of RAM and the 6600 XT 17-19 GB, which kinda lines up with my own tests using the 6600M at 1440p: even though I was using the High preset + FSR Quality, it was always around 19-20 GB of RAM used, and full VRAM.

[Image: FsYyxYTWAAAPiVG]


I also recorded my very first hour of this using the same settings, 1440p + FSR Quality at the High preset. Very weird stuff happens: 2 or 3 instances of a big "please wait" loading screen that ranged from 30 seconds to a bit more than a minute, the first loading screen taking more than 5 minutes (not recorded), the first death load screen taking quite a while, mouse stutter, the shader compilation still at 32% after an hour or so (I rebooted the PC, ran the game again, and it flew to 100% in less than 15 minutes lol), and a few other oddities (I have another small video from a cutscene where all the characters were soaked in water). But I honestly was expecting way worse. The game also wants the newest AMD driver installed, even though I'm using last month's driver.
 
There has been one. Can you guess which? Uncharted 4. The 2080 Ti is only a smidge faster than the PS5 overall in that game despite being consistently 40-50% faster in other games.

I don't think that was anywhere near this bad though. It ran fine with 8 GB as far as I've seen, and the 2080 Ti is still faster in that game, by a decent margin at 4K as well. Whereas here it actually seems quite a bit slower, which to me isn't just poor optimization; it's flat out broken.
 
I don't think that was anywhere near this bad though. It ran fine with 8 GB as far as I've seen, and the 2080 Ti is still faster in that game, by a decent margin at 4K as well. Whereas here it actually seems quite a bit slower, which to me isn't just poor optimization; it's flat out broken.

A 2080 Ti trades blows with a 3060 Ti, and a 3060 Ti isn't really that much faster than a 6600 XT or 6700.

GPUs that, in pure raster workloads, are at the level the PS5 performs at.

So is a 2080 Ti that much better in raster?
 
A 2080 Ti trades blows with a 3060 Ti, and a 3060 Ti isn't really that much faster than a 6600 XT or 6700.

GPUs that, in pure raster workloads, are at the level the PS5 performs at.

So is a 2080 Ti that much better in raster?

The 3060 Ti is marginally faster than the 2080 Super, but the 2080 Ti is a good 10% faster according to TPU. Only in RT corner cases would the 3060 Ti be matching or beating the 2080 Ti.

Again according to TPU, the 3060 Ti is 12% faster than the 6600 XT in raster and the 2080 Ti is 23% faster.
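Those two TPU deltas are at least consistent with the roughly 10% gap quoted above; a quick check using the percentages from this post:

```cpp
#include <cstdio>

int main() {
    // Normalize both cards to the 6600 XT using the quoted TPU raster deltas.
    double r3060ti = 1.12; // 3060 Ti = 6600 XT + 12%
    double r2080ti = 1.23; // 2080 Ti = 6600 XT + 23%
    std::printf("2080 Ti vs 3060 Ti: +%.0f%%\n",
                (r2080ti / r3060ti - 1.0) * 100.0); // ~ +10%
    return 0;
}
```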
 
The 3060 Ti is marginally faster than the 2080 Super, but the 2080 Ti is a good 10% faster according to TPU. Only in RT corner cases would the 3060 Ti be matching or beating the 2080 Ti.

Again according to TPU, the 3060 Ti is 12% faster than the 6600 XT in raster and the 2080 Ti is 23% faster.
The 3060 Ti doesn't match the 2080 Ti under any circumstances. The 2080 Ti and 3070 are almost always dead even. They're both about 15% faster than the 3060 Ti in general.
 
The 3060 Ti is marginally faster than the 2080 Super, but the 2080 Ti is a good 10% faster according to TPU. Only in RT corner cases would the 3060 Ti be matching or beating the 2080 Ti.

Again according to TPU, the 3060 Ti is 12% faster than the 6600 XT in raster and the 2080 Ti is 23% faster.

From looking at newer games, the 3060 Ti is much closer to the 2080 Ti.

But let's go with your figure: is 23% really going to make such a big difference?

It's not going to enable going from 30 fps to 60 fps, and it won't enable going from 1440p to 4K.

At best it'll stabilise a frame rate or allow a few extra quality settings to be turned up a notch.
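Rough numbers to illustrate: a 23% advantage takes a 30 fps baseline to about 37 fps, and 1440p to 4K is a 2.25x pixel-count increase, far beyond what 23% covers.

```cpp
#include <cstdio>

int main() {
    // Frame-rate headroom from a 23% raster advantage over a 30 fps baseline.
    std::printf("30 fps * 1.23 = %.1f fps\n", 30.0 * 1.23);
    // Pixel-count ratio going from 1440p (2560x1440) to 4K (3840x2160).
    std::printf("4K / 1440p pixel ratio = %.2fx\n",
                (3840.0 * 2160.0) / (2560.0 * 1440.0));
    return 0;
}
```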
 