Current Generation Games Analysis Technical Discussion [2024] [XBSX|S, PS5, PC]

That's certainly interesting, but its usefulness is unfortunately limited if it's only saving system memory, which is pretty much always in surplus on a half-decent gaming PC. If it were saving VRAM, that would be pretty significant.

That said, the fact that it's saving system RAM rather than VRAM is quite revealing with regard to how this game, at least, is using the PC's memory hierarchy for streaming. It seems system RAM does indeed act as a cache between VRAM and storage, and will be utilised to mitigate slower storage speeds.
 
Previously, DirectStorage on PC was doing very little, basically offering a new task queue for command submission on the CPU. I believe GPU decompression (GDeflate) was only recently introduced on PC, and I wonder what the compatibility is right now. Until then, I don't think DirectStorage matters much on PC.
 
Did some quick tests. Currently the AMD support for GPU decompression is still incomplete and full of bugs. I tested the 7000 series, so I wouldn't assume earlier generations can utilize it either. I don't think DirectStorage on PC is worth investing too much effort in.
 

I was curious whether the use of mesh shaders would show up in any way in the power consumption numbers of the consoles, versus each other and versus previous titles.
But the results here are pretty consistent with historical trends for each comparison.
I asked Antony if he was going to do Alan Wake 2, but his resources are limited. I'd really like to see that one, since it shows a bit more of a performance gulf.
In any case, visit his page and give the video a like.
 


Those power consumption numbers for that PC are insane.

My 4070 Ti is roughly a 3090 Ti in performance, and with my Ryzen 5 7600 at 5.35 GHz and the 4070 Ti at 3 GHz I get just under 300 W power consumption.
 
Thought people might be interested in this. Got a few captures of Spiderman 2 with the patch that mistakenly enabled the debug menu (already patched out). One of the features was a performance overlay, showing GPU/CPU frametimes, dynamic res, fps ranges and stutters (dropped frames).

Someone already asked in the comments btw if this was regular performance, and from my recollection yes, meaning I don't believe this debug menu degraded performance in any way. Pay more attention to the frametime graph and not whatever youtube/chrome is doing to the frame pacing.

Opening 30 minutes, Performance mode:


Cruising around city:


Streaming stressed:


Quality Mode:

 
Interestingly, the config file for The Last of Us Part I has this:

"SSRReflectionQuality": "1.000000",
"RTReflectionQuality": "4",

I assume RT means Real Time and not Ray Traced, correct? This game doesn't have ray-traced reflections as far as I'm aware.
 
Nice. I was wondering if we'd get any insight out of that slip up.

Wish there would have been storage bandwidth measurements as well lol.
 
Almost constantly GPU-limited (DRS working properly, usually ~1000p in performance mode with RT reflections, as we already knew), with the CPU doing not much in cutscenes and working more in I/O-heavy scenes. But interestingly, it's still rarely fully CPU-limited even in those moments at 60fps (dedicated I/O hardware working properly here; the situation is very different on PC, where on-the-fly decompression has to be done on the CPU). Obviously the CPU is mostly 'sleeping' in most cases at 30fps.

I'd say they could be fully CPU and GPU limited at ~80fps when uncapped. Have you tried it?
 

Relative to its 33.3 ms window, yes, but it is interesting how CPU frametimes remain similar to Performance mode (often ~14-16 ms) despite the halved frame rate target. They clearly used that extra time to increase the CPU load, and it shows: LODs are significantly increased for some objects, well beyond what you'd expect from the resolution increase alone. If I could, I would probably replay this game in the 40 Hz mode; Quality mode just looks worlds better.


Don't have a 60+ Hz display atm. Regardless, the patch has already been fixed (literally booted it up the next day thinking "oh shit, better disable automatic updates" and then got the notification the patch had been applied, fuck!).
 
But interestingly it still rarely fully CPU limited even in those moments at 60fps (dedicated I/O working properly here, situation is very different on PC as on the fly decompression has to be done on CPU).

The game isn't on PC yet (that pirated version doesn't count). I fully expect the actual PC release to feature GPU decompression. If it's anything like Spider-Man or Ratchet & Clank, though, the CPU requirements could well be modest anyway.
 

It does have ray-traced reflections... of capsule man! Capsule man is made of basic geometric shapes: the same capsules generally used for collision, then in The Last of Us (PS3 version) adapted for ambient occlusion (sort of), then in The Last of Us Part II adapted for hacky realtime RT reflections as well. Other games have done similar since.

I kinda expect these to make a comeback sometime this gen. Capsule man needs albedo for better reflections, but that could be accomplished with nothing more than a single spherical harmonic (or similar) term per capsule. If you combine a BVH, signed distance fields for the static geo in the BVH, and switch to tapered capsules (just capsules where one end can optionally be smaller than the other; better for collision anyway), you get an RT structure that can trace a ton of rays on current consoles. 60fps with a lot of animated characters would be possible, and as long as you don't have clean window reflections or something, players might not be able to tell. It would be useful for more fantasy-like settings anyway.
 
This game seems to have very serious problems with screen tearing on console in the full release.

(cued to relevant section)

I assume John's DF video from today only showed the PC version.
 