Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

The 4090 is several times more powerful than the PS5's GPU, and the 5800X3D is far above the PS5's CPU. The fact that the 4090 drops into the 40s is not even comparable. Furthermore, what setting below Very Low are you referring to?

It was a 5800X, not an X3D. Still more powerful than the PS5, of course, but without matched settings I'm not sure what the value of trying to draw conclusions is at this point.

That game, with or without RT, eats even beefy CPUs for breakfast even at PS5 settings, whereas the console chugs along just fine.

How are you defining "chugs along fine"? As above, until we see matched settings (or as close to matched as possible) in the same areas, I don't see how you're reaching this conclusion.
 
Crowd density on PS5 is below Very Low on PC - that original video had the crowd density setting on PC not functioning.
In Miles Morales, Spider-Man, or both?
A Ryzen 5 3600 with RT on can get 60 FPS and above in Times Square with my optimised settings. If your 9900K gets worse than that, I would be shocked.
I wouldn't.


Notice that my GPU remains in the 60% range. Disabling HT, funnily enough, improves performance. This has been reported by many people since day one, but it doesn't seem like much has been done about it.
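On the HT observation: one way people approximate "HT off" without a BIOS trip is restricting a process to one logical processor per physical core. A minimal Windows-only sketch of that idea, assuming the common SMT2 layout where sibling hardware threads are adjacent logical CPUs (a convention, not a guarantee on every system):

```cpp
// Hedged sketch: restrict the current process to every other logical
// CPU, approximating "HT off" on a typical SMT2 layout. The even/odd
// sibling pairing is a common convention, not a guarantee.
#include <windows.h>
#include <cstdio>

int main() {
    SYSTEM_INFO si;
    GetSystemInfo(&si);

    DWORD_PTR mask = 0;
    for (DWORD i = 0; i < si.dwNumberOfProcessors; i += 2)
        mask |= (DWORD_PTR)1 << i; // keep one thread per core pair

    // Real Win32 call; to apply this to a game you would open its
    // process handle instead of using GetCurrentProcess().
    if (SetProcessAffinityMask(GetCurrentProcess(), mask))
        std::printf("affinity mask: 0x%llx\n", (unsigned long long)mask);
    return 0;
}
```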

Even with matched settings in the original game, the demands on the CPU are insane.
 
Surely you can see the difference in your video as to why the game runs as it does on your setup - you are ultrawide + extended FOV. Extending the game's FOV draws way more objects, as view frustum culling is much less effective at a much higher FOV. This game also draws more objects in the RT view, at higher quality, when they are within the view frustum (pretty easy to test by comparing the quality of objects in screen space vs. out of screen space).
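To see the frustum culling point numerically: scatter objects around a camera and count how many land inside the view cone as the FOV widens. A standalone sketch, illustrative only and not the game's actual culling code:

```cpp
// Sketch: count how many randomly scattered objects fall inside a
// horizontal view cone as the FOV widens. A wider FOV means more
// objects survive the cull and must be drawn.
#include <cmath>
#include <cstdio>
#include <random>

int main() {
    const float fovs[] = {70.0f, 90.0f, 110.0f, 130.0f};
    const int kObjects = 100000;
    for (float fovDeg : fovs) {
        const float halfAngle = fovDeg * 0.5f * 3.14159265f / 180.0f;
        std::mt19937 rng(42); // same object placement for every FOV
        std::uniform_real_distribution<float> coord(-1000.0f, 1000.0f);
        int visible = 0;
        for (int i = 0; i < kObjects; ++i) {
            float x = coord(rng), z = coord(rng);
            if (z <= 0.0f) continue; // behind the camera
            // Inside the cone if the angle off the view axis is less
            // than half the FOV.
            if (std::atan2(std::fabs(x), z) < halfAngle) ++visible;
        }
        std::printf("FOV %5.1f deg -> %d of %d objects pass the cull\n",
                    fovDeg, visible, kObjects);
    }
}
```

The survivor count climbs steadily with the half-angle, which is the extra draw (and RT) work an extended FOV buys you.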
What about 16:9, standard FOV, with pedestrian density set to very low? I assume it runs much better.
 
I never lined that setting up or checked it - I just checked the Fidelity 30 FPS mode and the Performance RT mode with a 60 FPS cap.
There is no difference in graphics settings between the uncapped 40 FPS mode and the 30 FPS mode (probably only a slight difference in dynamic resolution). By the way, why use capped modes for comparison when there are uncapped ones?
 
I never compared the game's GPU performance to PS5 as -
1 - You cannot get the same settings on PC as on console (hair, depth of field, and traffic cannot be lined up at all, and someone is lying if they say they have the PS5 settings exactly).
2 - Nixxes told me the dynamic resolution works differently on PC, which makes the comparison useless anyway (see the sketch below).
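On point 2: dynamic resolution is typically a feedback controller tuned per platform, so two implementations rarely land on the same resolution for the same scene. A generic sketch of such a loop, with made-up target, gain, and clamp values (nothing from Nixxes' actual code):

```cpp
// Generic dynamic-resolution feedback loop, purely illustrative.
// The target, gain, and clamps are the per-platform tuning knobs,
// which is why PC vs PS5 DRS picks different resolutions.
#include <algorithm>
#include <cstdio>

int main() {
    const float targetMs = 16.6f;   // 60 FPS budget
    float scale = 1.0f;             // fraction of native resolution

    // Fake GPU frame times for a few frames (ms).
    const float gpuMs[] = {18.0f, 19.5f, 17.0f, 15.0f, 14.0f, 16.0f};

    for (float ms : gpuMs) {
        // Nudge the render scale toward the frame-time budget.
        scale *= targetMs / ms;
        scale = std::clamp(scale, 0.5f, 1.0f);
        std::printf("gpu %.1f ms -> render scale %.2f\n", ms, scale);
    }
}
```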
 
A 5600X matching the PS5 CPU seems in line with what one would expect; heck, the 5600X should be more capable. Perhaps not in ports of native PS games, however. Ports are useless for gauging performance against other platforms.
 
Ultrawide



They're high. Higher than we would expect for a CPU similar to the PS5's, likely largely due to the CPU-based decompression that the PS5 doesn't have to deal with. But I wouldn't say insane.
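For context on where that decompression cost lands: the PS5 decompresses streamed data in a fixed-function unit, while a PC port pays for the same work on CPU worker threads. A rough sketch, with `decompress_block` as a hypothetical stand-in for a real codec (Kraken, zlib, etc.):

```cpp
// Sketch of where streaming decompression lands on each platform.
// decompress_block() is a hypothetical stand-in for a real codec.
#include <cstdint>
#include <cstdio>
#include <future>
#include <vector>

std::vector<uint8_t> decompress_block(const std::vector<uint8_t>& in) {
    // Placeholder: a real codec would go here, and is what actually
    // saturates CPU cores on PC.
    return in;
}

int main() {
    std::vector<std::vector<uint8_t>> blocks(64, std::vector<uint8_t>(4096));

    // PC path: the CPU pays for every block, typically on a worker pool.
    std::vector<std::future<std::vector<uint8_t>>> jobs;
    for (auto& b : blocks)
        jobs.push_back(std::async(std::launch::async,
                                  decompress_block, std::cref(b)));
    for (auto& j : jobs) j.get();
    std::puts("PC: decompression ran on CPU worker threads");

    // PS5 path (conceptually): the I/O complex hands back ready-to-use
    // data, so the equivalent loop costs the CPU next to nothing.
    std::puts("PS5: hardware unit decompresses; CPU just consumes buffers");
}
```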
Does the PS5's decompression block really help to such a degree? That's pretty impressive
 
This one performs similarly on my aging 2700X and VRAM-limited 3070.
At native 1440p with both reflection settings set to very high, I get similar, or maybe a tad better, performance compared to the base game.

Reflection geometry at high gives enough headroom for a healthier 60 FPS lock, but both seem playable and acceptable. DLSS Quality at 1440p would be more useful if I had a 5800X or something. Otherwise, it just frees up GPU resources for higher settings, maybe ray-traced shadows. But then I run into more memory problems, so nah.

Despite traffic and crowd density being set to very low, the streets are full of NPCs and cars, so I have no idea how that setting functions. I guess developers stopped respecting the semantics of certain words. Wish they didn't.

High reflection geometry does not live up to the name "High" with its hideous texture-like reflections, whereas Low crowd density does not live up to the name "Low" with its bustling streets. That's my only gripe, but this has been the case for a lot of games recently, so yeah.

"Very low" crowd density

[screenshot]



"High" the almighty "high" reflection geometry

[screenshot]






This would be a good start for people who are preparing these settings presets, I guess.
 
It is indeed all relative - the "very low" crowd density in this game is higher than in literally almost any other open world game lol. While the "high" RT reflections are... uh... yeah, really poor beyond their extreme distance (multiple km away).
Do not forget, of course, to restart the save or reload when changing crowd/traffic density. Those are the only two options in the game that require a full save restart to actually take effect.
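The restart requirement is consistent with those two options only being read when the open world is populated, rather than every frame. A guess at the pattern in miniature (hypothetical structure, not the actual engine):

```cpp
// Sketch: why crowd/traffic density might need a save reload.
// Most options are read every frame; population density is only
// consumed once, when the world is spawned.
#include <cstdio>

struct Settings {
    int reflectionQuality = 2; // read every frame -> applies instantly
    int crowdDensity      = 0; // read only at world spawn
};

struct World {
    int spawnedNpcs = 0;
    void populate(const Settings& s) {
        // Density is baked in here; changing the setting later does
        // nothing until populate() runs again (i.e. a reload).
        spawnedNpcs = 500 + 500 * s.crowdDensity;
    }
};

int main() {
    Settings s;
    World w;
    w.populate(s);
    std::printf("NPCs: %d\n", w.spawnedNpcs);

    s.crowdDensity = 3;              // user changes the slider...
    std::printf("NPCs still: %d\n", w.spawnedNpcs); // ...no effect yet

    w.populate(s);                   // save reload re-populates
    std::printf("NPCs after reload: %d\n", w.spawnedNpcs);
}
```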

Does the PS5's decompression block really help to such a degree? That's pretty impressive
In my interview, I believe they mentioned it saturating at least one core on PC (of Zen 2 quality?).
 
Yeah, I always have to restart in this game. For example, I was getting 45 FPS or so at native 1440p. I thought, let's enable DLSS and see what happens. Lo and behold, I now get 35 FPS for some mystical reason. Restart, and now it works like it should, at 60+ FPS. The game has some problems on that front. :oops:
 

During the Spider-Man post mortem, they said the engine's CPU-side architecture uses a main thread + a render thread. They use all the cores, but this is not very efficient because they will hit a single-threaded bottleneck, probably on the main thread. This is the reason it runs much better on CPUs with very high frequencies, like the 5.8 GHz of some Intel CPUs. It is better to have an ECS architecture like Bungie's Tiger Engine or the Decima Engine, or a job-based architecture like Bluepoint's or Naughty Dog's. They said they were thinking about evolving this part of the engine. I hope it will be there for Spider-Man 2.

EDIT: Insomniac is focused on release dates. They will never integrate a feature if it means the game will not ship in the year they want. With Spider-Man 2 I have some hope, because this will be the first time they have had 5 years to develop a game. It will probably help them rework the engine.
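To make the post-mortem point concrete: with a main + render thread split, frame time is bounded by the busier of the two dedicated threads regardless of core count, which is why raw clock speed helps so much; a job system divides the same work across all cores. A toy model with made-up numbers, just to show the two scheduling shapes:

```cpp
// Toy model of the two architectures from the post mortem.
// N work items per frame; timings are illustrative only.
#include <cstdio>

int main() {
    const double itemMs = 0.02;  // cost of one work item on one core
    const int    items  = 2000;  // work items per frame
    const int    cores  = 8;

    // Main + render thread: work is split across just two dedicated
    // threads, so frame time is bounded by the busier of the two.
    double mainMs   = (items * 0.6) * itemMs; // 60% lands on main
    double renderMs = (items * 0.4) * itemMs;
    double twoThread = mainMs > renderMs ? mainMs : renderMs;

    // Job system: the same items become jobs that any core can steal,
    // so frame time divides by the core count (in the ideal case).
    double jobSystem = items * itemMs / cores;

    std::printf("main+render threads: %.1f ms/frame (frequency-bound)\n",
                twoThread);
    std::printf("job system on %d cores: %.1f ms/frame\n", cores, jobSystem);
}
```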
 
Is there a reason we ignore that consoles use APUs and unified memory?

The gist of the discussion seems to revolve around the idea that the CPU on a PC is doing more work, or working more inefficiently, than its console brethren. Thereby the GPU is forced to wait on the CPU and is often underutilized.

Could it be that consoles do a better job of keeping their GPU fed because their memory systems are more performant? And that it's no fault of the developers, because it's hard to optimize around that reality?
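One concrete difference behind this question: feeding a discrete PC GPU usually means an extra hop through a staging buffer and across PCIe, which a unified-memory APU skips entirely. A sketch with memcpy standing in for the PCIe upload (no real graphics API involved):

```cpp
// Sketch of the extra hop a discrete PC GPU pays versus unified memory.
// memcpy stands in for the staging write and the PCIe transfer.
#include <cstdint>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    std::vector<uint8_t> asset(16 * 1024 * 1024); // decoded asset in RAM

    // PC with a discrete GPU: CPU writes into a staging area, then the
    // data is copied across PCIe into VRAM before the GPU can read it.
    std::vector<uint8_t> staging(asset.size());
    std::vector<uint8_t> vram(asset.size()); // stand-in for GPU memory
    std::memcpy(staging.data(), asset.data(), asset.size());
    std::memcpy(vram.data(), staging.data(), staging.size());
    std::puts("discrete GPU: RAM -> staging -> PCIe -> VRAM");

    // Console APU: CPU and GPU share one pool, so the GPU can read the
    // asset where it already sits. No copy, no PCIe transit.
    std::puts("unified memory: GPU reads the same bytes in place");
}
```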
 
Surely you can see the difference in your video as to why the game runs as it does on your setup - you are ultrawide + extended FOV. Extending the game's FOV draws way more objects, as view frustum culling is much less effective at a much higher FOV. This game also draws more objects in the RT view, at higher quality, when they are within the view frustum (pretty easy to test by comparing the quality of objects in screen space vs. out of screen space).
It was actually recorded at 1440p, even with the black bars, but it seems GeForce Experience or YouTube just puts the video in the aspect ratio of the monitor, or more than likely the aspect ratio of the OS resolution at the time of upload. Maybe the NVIDIA capture tool just ignores the OS settings; not sure. Whatever the case, as I play, it's definitely 16:9. I made sure to also change the monitor resolution to 2560x1440, and I even set the aspect ratio to 16:9 here:


Identical performance. Also, changing those settings usually has almost no impact in other games. Usually narrowing or widening the FOV has virtually no impact in my games, nor does manually setting the display resolution to 16:9 rather than just in-game.
 
A 5600X matching the PS5 CPU seems in line with what one would expect; heck, the 5600X should be more capable. Perhaps not in ports of native PS games, however. Ports are useless for gauging performance against other platforms.

This is normal. The PS5 CPU does less work; for example, in RT mode the static part of the BVH is streamed. I don't expect Spider-Man Miles Morales to push the SSD, but in future titles it will cost more on a PC CPU than on the PS5 CPU to decompress data, manage memory check-in, and other stuff.

And it is the same for the XSX and XSS CPUs.
 

Any sources for those claims?
 
This is hyperbole, btw. Some people keep repeating over and over again how Spider-Man is special and how it uses the CPU to create BVHs and so on. Almost ALL ray tracing games so far utilize the CPU heavily, most likely for BVH creation. There are certain factions:

1) The NVIDIA faction: "The game is not utilizing RTX GPUs properly!!!" - No relation: almost all ray tracing games so far have utilized the CPU heavily like Spider-Man does, and most likely for similar reasons.
2) The PS5 faction: "PS5 is creating them with special hardware, you have to do it on the CPU on PC, etc. etc." - It is quite possible the PS5 has some special trick up its sleeve. I can neither deny nor agree with this statement.

The 1st argument is used by NVIDIA fans to speak badly of the game, despite not understanding or realizing that ever since ray tracing arrived in PC games, it has been super CPU-bound, almost always adding an extra 15% to 30% of CPU load. It is no different in Spider-Man; it simply takes 25-30% more CPU power, just like in Cyberpunk, GOTG and many other ray tracing titles whose CPU impact I've PERSONALLY tested.

Some people see how CPU-bound Spider-Man is and naturally think it is an evil caused by ray tracing, with Sony/Insomniac purposefully gimping the PC version so it doesn't utilize RTX hardware. I really don't think that is the case. I'm sure HW-accelerated ray tracing still plays a part in the BVH structures, but it is quite evident that almost all ray tracing games so far have had increased CPU demands. On top of the BVH work, Alex also stated that the extra objects drawn/rendered in reflections pile up on the CPU as well.
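For a feel of the per-frame CPU cost being described: before any acceleration-structure build or refit the driver performs, the engine itself has to update bounds for every dynamic instance each frame. A crude standalone illustration (not any engine's real BVH code):

```cpp
// Crude illustration of per-frame CPU work for ray tracing: refitting
// axis-aligned bounds over dynamic instances before a BVH update.
#include <chrono>
#include <cstdio>
#include <random>
#include <vector>

struct Aabb { float min[3], max[3]; };

int main() {
    std::mt19937 rng(1);
    std::uniform_real_distribution<float> d(-100.f, 100.f);
    std::vector<Aabb> instances(200000);
    for (auto& b : instances)
        for (int i = 0; i < 3; ++i) { b.min[i] = d(rng); b.max[i] = b.min[i] + 1.f; }

    auto t0 = std::chrono::steady_clock::now();
    // "Animate" every instance and refit its bounds, as an RT frame
    // must do before rebuilding/refitting the acceleration structure.
    for (auto& b : instances)
        for (int i = 0; i < 3; ++i) { b.min[i] += 0.01f; b.max[i] += 0.01f; }
    auto t1 = std::chrono::steady_clock::now();

    long long us = std::chrono::duration_cast<std::chrono::microseconds>(t1 - t0).count();
    std::printf("refit of %zu instances: %lld us (every frame, on CPU)\n",
                instances.size(), us);
}
```

This is only the engine-side slice; the driver's own build work, and any extra objects drawn in reflections, come on top of it.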

 