Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

Yeah, this might be a good game to look at when Zen 4 and DDR5 land, see if performance gains going to that differ particularly from other games.

Do we know if Spiderman is using AVX2? 256-bit operations seem to be pretty demanding in terms of bandwidth on PC, though on PS5 the situation would be a bit different as it's got half-width vector operations.
AC Odyssey is another one. Both these games peak at ~32GB/s and average just over 28GB/s when running uncapped on my system. My system maxes out at ~35GB/s when performing both reads and writes, as reported by Intel's Memory Latency Checker. And latencies grow massively as you approach your bandwidth limit.

Also, look at the 'DRAM bandwidth bound' metric in my screenshot above. I messed up by not including it in the non-RT shot but it reports 0% vs 97.9%.
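To make that concrete, here's a rough single-threaded sketch of the kind of combined read+write streaming test MLC performs (my own toy code, not MLC's - the real tool uses multiple threads and non-temporal stores, so this will understate its numbers):

```cpp
// Rough single-threaded bandwidth sketch (not MLC's code): stream a buffer
// much larger than the LLC and time the combined read+write traffic.
#include <chrono>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    const size_t bytes = 1ull << 30;                  // 1 GiB working set
    std::vector<char> src(bytes, 1), dst(bytes, 0);
    const int iters = 8;

    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < iters; ++i)
        std::memcpy(dst.data(), src.data(), bytes);   // 1 read + 1 write per byte
    auto t1 = std::chrono::steady_clock::now();

    double secs = std::chrono::duration<double>(t1 - t0).count();
    double gb = 2.0 * iters * bytes / 1e9;            // count both directions
    std::printf("~%.1f GB/s combined read+write\n", gb / secs);
}
```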
 
I have repeatedly said that the 2700X trades blows with the 4700S in benches. I was the first person to say that, back when you denied it and falsely claimed that the 2700X was greatly ahead of the 4700S based on benches you clearly were unable to understand. I actually provided the benches that showed this.

You repeatedly misrepresent what I said, no matter how many references and quotes I provide about what we both said.

Your dishonesty is clearly intentional at this point, as you have proved many times now.
So I'm baffled. You said the 2700X is a bad choice compared to the PS5, so I thought you believed it's slower than the 4700S, and now it turns out you know it's a generally close hardware representation - and then you started shitposting at me :d I'm not in your head so I don't know what you think, but the way you communicate your opinion is just pure confusion for me.
 
Are all the Intel CPUs using DDR5? Only the 12900k shows different scaling with RT.

Oddly enough, they only specifically identify the 12900K and its motherboard as their test system; they don't identify the motherboards/RAM for the other systems, or how they sourced them. I also got the RAM speed wrong earlier: it was DDR5-5400, not 4800.

computerbase said:
The graphics card benchmarks (to follow on Friday, 12 August 2022) were run on an Intel Core i9-12900K (test) at default settings. The Asus ROG Maximus Z690 Apex (BIOS 0702) with the Z690 chipset was used as the mainboard, so graphics cards could run over PCIe 4.0.
computerbase said:
The CPU was cooled by a Noctua NH-D15S with a centrally installed 140mm fan. 32 GB of memory (Corsair Vengeance, 2 × 16 GB, DDR5-5400, 40-40-40-84-2T) were available to the processor. Windows 11 21H2 with all updates was installed on an NVMe M.2 SSD with PCIe 4.0. The same was true for the game. Resizable BAR has been used on supported graphics cards from both AMD and Nvidia.
 
Is it normal for the CPU to be updating the BVH? I think this is done via GPGPU on consoles IIRC
TLAS updates happen on CPU, devs decide how often to update it.
BLAS building is a GPU thing hidden behind the driver, so there's not much devs can do with it besides choosing an update strategy - refit or rebuild.
Skinning can be done on either CPU or GPU. For low-poly / LOD RT models, CPU might be preferable since there likely wouldn't be enough vertices to saturate a GPU (otherwise this stuff can be moved to async compute and run in parallel with G-buffer fill).

Given the increased PCIe traffic with RT on in Spider-Man, it seems RT model skinning is done on the CPU side there, so lots of additional mesh data has to be passed from CPU to GPU each frame.
I guess the next optimisation for Nixxes would be moving that stuff to the async queue as all optimisation guides suggest.
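For anyone wondering what the refit-vs-rebuild choice actually looks like, a minimal sketch using the real D3D12/DXR build flags - the helper and its deformation threshold are made up for illustration, not anything from Insomniac/Nixxes:

```cpp
// Sketch of the refit-vs-rebuild decision for a skinned BLAS (Windows/DXR).
// Refit (PERFORM_UPDATE) is cheap but degrades trace quality as geometry
// drifts from the original build; a rebuild restores quality.
#include <d3d12.h>

D3D12_RAYTRACING_ACCELERATION_STRUCTURE_BUILD_FLAGS
ChooseBlasBuildFlags(bool topologyChanged, float accumulatedDeformation)
{
    if (topologyChanged || accumulatedDeformation > 0.5f)   // threshold is made up
        return D3D12_RAYTRACING_ACCELERATION_STRUCTURE_BUILD_FLAG_ALLOW_UPDATE |
               D3D12_RAYTRACING_ACCELERATION_STRUCTURE_BUILD_FLAG_PREFER_FAST_TRACE;
    // Refit: keep the existing BLAS topology, just update node bounds.
    return D3D12_RAYTRACING_ACCELERATION_STRUCTURE_BUILD_FLAG_ALLOW_UPDATE |
           D3D12_RAYTRACING_ACCELERATION_STRUCTURE_BUILD_FLAG_PERFORM_UPDATE;
}
```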
 

6600 XT ripping through Spiderman at DF settings + high RT. That's before the promised patches, which should improve performance a lot on the CPU side of things.
 
Given the increased PCIe traffic with RT on in Spider-Man, it seems RT model skinning is done on the CPU side there, so lots of additional mesh data has to be passed from CPU to GPU each frame.
I guess the next optimisation for Nixxes would be moving that stuff to the async queue as all optimisation guides suggest.

Yup.

 
Interesting data HolySmoke, thanks. Computerbase.de's CPU benchmarks for Spiderman - which show Intel greatly outpacing Ryzen with RT, when AMD was far more competitive without it - may then be explained by their Intel test system using DDR5-5400*, while the Ryzen systems of course used DDR4.

edit: RAM speed was DDR5-5400, not 4800.

One interesting thing is that the 3600 is quite a bit faster than the 10600K. However, it would be interesting to find out whether this is actually down to the CPU itself or to the PCIe 4.0 vs 3.0 support.
 
FWIW, I've also tested PCIe bandwidth, and that screenshot is misleading because the bus isn't loaded like that at all times. I mentioned in an earlier post that both HZD and SM show fairly high bus loads, but it's only in the 10-20% range.

If the game was actually sustaining such high PCIe bandwidths then it flat out wouldn’t be running at 60fps. It wouldn’t even be running at 10fps.
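Quick back-of-envelope on why: even at the theoretical ~32GB/s of PCIe 4.0 x16 (assumed here), sustained saturation would mean over half a gigabyte of bus traffic every single frame at 60fps:

```cpp
// Back-of-envelope: per-frame bus traffic implied by a saturated PCIe 4.0 x16 link.
#include <cstdio>

int main() {
    const double bus_gb_per_s = 32.0;   // ~theoretical PCIe 4.0 x16
    const double fps = 60.0;
    std::printf("%.0f MB of bus traffic per frame\n",
                bus_gb_per_s * 1000.0 / fps);   // ~533 MB every frame
}
```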
 
Was too curious so obtained (cough) Spiderman RM for the PC to just compare on a technical level to the PS5 version, which I've prob played through 3 times by now. This is on an i5-12400 @ 4GHz, 3060, 16GB. Latest patch. Game is installed to an Intel NVMe at 1.8GB/s, so def a middling drive, but it should not be a bottleneck. So a relatively modern budget (?) PC.

Config: everything on High, except Crowd Density and Hair at Medium and DoF at Low - the target from the outset was a solid 60fps at 4K with DLSS Quality, though dynamic res was engaged at some points. Playing on a 60Hz TV.

So, thoughts:
  • 4K DLSS Quality is indeed 60fps for the most part running through the city with my settings; fighting and indoors are easily no issue, but some cutscenes can drop into the 50s.
  • On that "for the most part": even without a high CPU/GPU load (my CPU is at 40-50% swinging through the city without RT), there are periodic frametime spikes of varying degrees which no settings can help with, as it doesn't look like any particular component is the bottleneck. Some of these are the issue DF mentioned, where rapidly switching views can stress the culling system - such as running up a building then using a web to slingshot 180 degrees, which can cause a drop to 58/59fps. The more annoying one, though, is a ~60+ms spike that occurs periodically while swinging around, with no obvious cause. Potentially shader compilation, but there's no large CPU spike, and the spikes can recur in the same area well after it's first encountered. I've seen others with middling rigs complain about this in Reddit threads too.
I'm definitely somewhat of an outlier with PC gaming as I play mostly on a 60Hz TV, so this kind of thing has always been why I'm wary of PC benchmarks and how they might apply to my setup, even those that include "1% lows". With a long enough data capture, especially with an uncapped framerate, these hitches would barely register - but they are certainly noticeable during gameplay, especially on a fixed refresh rate display. I want consistency there, and I can't quite get it now - the fact that uncapped I could have a ~80fps average or even a 60+ 1% low means little, as these hitches announce themselves pretty prominently on a 60Hz display. These aren't every few seconds, mind you - I can have 2+ minute sections without seeing them - but they're frequent enough to be distracting (see the quick sketch at the end of this post for how the averages bury them).

(BTW, on that note: I only recently looked at the benchmark video computerbase.de was using for their CPU benchmarks, and it's largely an indoor environment, with just strolling through the city - on foot - for the external portion. So keep that in mind, considering traversing the city can be a more stressful test for certain portions of this engine.)
  • Dynamic res is interesting: it tries to maintain ~95% GPU usage at all times, so it can actually end up producing a more detailed image than if it were disabled. I think this is how it works on the PS5 as well. For example, setting DLSS Quality at 4K without dynamic res when the GPU load is ~75% will produce a slightly inferior image to having dynamic res enabled, which is kinda neat!
...but, as noted by Alex, dynamic res can also introduce some drops as it can't react quickly enough (a problem I've found with some DRS implementations on PC). When it works it's great - I can barely notice any res change when performance is kept at 60 - but it's definitely not as flawless as a dynamic res implementation needs to be to make sense. You want it enabled to avoid those occasional load spikes; if it can't adjust quickly enough and you get stuttering, it somewhat defeats the purpose. As DF mentioned, though, I think this can be largely fixed by just being a little more aggressive about lowering res. Titanfall 2 on PC, for example, lets you set the dynamic res target in 1fps increments: if you set it at 60 you'll get stuttering, as it aims for too consistently high a GPU load and can't keep up when there's a big GPU load spike, but if you lock your framerate to 60 externally and then set the game's dynamic target to, say, 65fps, it has no issue and scales perfectly without stutter.
  • DLSS is noticeably sharper than IGTI (Insomniac's temporal upscaling) on the PS5 or on PC, but I'm not sure how much of that is due to having the sharpening cranked up vs. actually reconstructing more accurately. Distant detail is definitely more visible and 'complete' with it, but it also produces more specular aliasing at times.
One of the reasons I loved Insomniac's implementation so much on the PS5 (at least in non-RT mode) was that it struck a great balance of being sharp but also cohesive. It produces an extremely stable image with very few reconstruction artifacts. DLSS seems to veer too much into sharpening atm, which can look better in some spots and does appear 'higher res', but can produce more shimmering when swinging around - this was evident even in YouTube videos for me. Even at DLSS Quality, there are also a few instances where screen space reflections were actually slightly more accurately reconstructed on the PS5, such as a wire structure reflected in a puddle - you could see more detail in the reflection on the PS5, albeit only slightly.

So over the course of many different scenarios throughout the city, using DLSS over Insomniac's scaling is actually not as cut-and-dried as I thought it would be. I mean I like sharp! That's why I usually played in Performance mode on the PS5 over RT, but specular aliasing really bugs me as well.
  • Definitely some issues with textures periodically loading late. Buildings in the distance will sometimes show a lower-res mip, and some elements in cutscenes are obviously not loading at their highest detail as they should. Relatively few occurrences, but they do stand out.
So overall, on the non-RT image quality front, I'd say it's a mixed bag on my rig. At first glance when you boot it up, it's "Wow, DLSS is so much sharper!", but when you visit more locations and jump back and forth with the PS5 it's not so clear-cut (at least without RT - more on that below). The exception is with dynamic res enabled: if you have a more powerful GPU than mine and, say, set a 60fps cap, DLSS would likely look much better. I can test this by setting a dynamic 30fps cap, and while it doesn't infinitely up-res (my fps was still ~45-50 with this low cap), it's obviously constructing DLSS from a higher res than 4K, because it looked noticeably better than the PS5 and even native 4K TAA.

As for RT, I only played around with that mode for about 10 mins, but with dynamic res at DLSS Quality/Balanced it actually holds 60fps surprisingly well, albeit with the same hitches described above that also occur without RT. My CPU load was ~60% with a cap of 60, though I didn't explore every area (such as the notoriously heavy Times Square). Compared to PS5 RT Performance mode, DLSS provides a significantly more detailed image, whether with dynamic res or at DLSS Balanced without it. But again, a downside too: DLSS with RT can produce noticeably more flickering in RT reflections on some surfaces such as wet roads, especially in DLSS Performance mode. Surprisingly, even FSR or IGTI Performance mode doesn't exhibit these artifacts to the same degree. So it's significantly sharper than PS5 RT Performance mode, yes, but with dynamic res, or locked to DLSS Performance to keep it around 60, you're getting even more of the specular aliasing that bugged me without RT. Still superior to the PS5 overall visually, I think, but with some drawbacks.

Overall, it's...ok? A lot of good quality-of-life features, as mentioned by DF - being able to change nearly every graphics setting and have it reflected instantly (the PS5 forces a checkpoint restart when switching to RT mode), very quick loads - and it can look superior to the PS5 version on my rig. But the framerate inconsistency, with RT and without, is the most noticeable thing to me, so I'd still have to give it to the PS5 right now, which is actually somewhat rare when comparing the same game on my PC vs PS5 when the PC version supports DLSS (Death Stranding is about the only other, due to its frame pacing issue on PC).

It's good in many ways, but I'll revisit it with subsequent patches. Outside of the RT CPU load everyone knows about, they just gotta fix those texture mip issues, get rid of the periodic hitches, and perhaps add a DLSS sharpening slider. If they can manage those, it would be a clear choice and an exceptional port.
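And since I keep banging on about benchmarks hiding this stuff, a quick sketch of the arithmetic (synthetic numbers, just to illustrate): two minutes at 60fps with five 60ms hitches still averages ~59.9fps, and even the "1% low" only falls to around 50 - figures that look fine on a graph while every one of those hitches is plainly visible on a 60Hz display.

```cpp
// Two minutes of 16.7 ms frames with five 60 ms hitches: the averages barely move.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <numeric>
#include <vector>

int main() {
    std::vector<double> ft(7200, 1000.0 / 60.0);      // 2 min of 60 fps frametimes
    for (int i = 0; i < 5; ++i) ft[i * 1400] = 60.0;  // five visible hitches

    double avg = 1000.0 * ft.size() / std::accumulate(ft.begin(), ft.end(), 0.0);
    std::sort(ft.begin(), ft.end(), std::greater<>());
    size_t n = ft.size() / 100;                       // worst 1% of frames
    double low = 1000.0 * n / std::accumulate(ft.begin(), ft.begin() + n, 0.0);
    std::printf("avg %.1f fps, 1%% low %.1f fps\n", avg, low);  // ~59.9 / ~50.8
}
```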

(Edit: jfc what a long post I need a life)
 
I'm waiting on my new Ryzen combo to arrive, but it would be great if anyone could test with different memory frequencies (and thus different levels of bandwidth) to see if the game is indeed RAM-bandwidth bound on slightly older CPUs (which, given the game logic, BVH and decompression work it's doing, it very well might be).

I'll test a Ryzen 5 3600 when mine arrives.
 
Love your post - but one thing to mention is how DRS specifically differs between console and PC, based on my understanding of what Nixxes told me in an interview.
On console, there is a list of pre-determined resolutions it switches between; you can see these easily in manual pixel counts of the game. It scales to 4K output with IGTI, but the list of internal resolutions it can actually run at is limited: 1440p, then ~1296p, then 1080p. So on console, if the GPU could technically do 1512p at 60, it won't, as the list does not include that; it stays at the next closest stable resolution, 1440p. This ensures healthy headroom on the GPU. Then if it does need to drop res (e.g. because 1440p would dip below 60fps), it drops to the next entry on the list, either 1296p or 1080p. This means each drop removes a significant percentage of the total pixels - it drops in big chunks, basically - and those big chunks mean the framerate can recover rapidly.

On PC, it is fine-grained: no pre-determined list, and per frame it drops by a "custom" amount to account for the frame drop. Since it is fine-grained it also scales to higher resolutions (in a moment where it could do 1512p, for example, it would), but as a consequence the GPU is closer to being tapped out more often. As a result, the GPU does not have as much headroom to absorb larger swings in how heavy a scene is. When it does come time to drop res, it drops to whatever resolution accounts for the measured frame-time loss, not to an entry from a list. That makes dropped resolution less obvious, as it is more specific, granular and, technically, gradual.

This is a big issue for comparing PC to console, and why I think the attempt is flawed until PC also has a DRS mode similar to the console one. On PC, the GPU will be pegged near full utilisation more often with DRS on than on console, as on PC it scales to an arbitrary res. It will also not adjust as rapidly (or as coarsely) on PC as it does on console, meaning frame drops can occur multiple frames in a row while the adjustment happens, vs. the one coarse drop you may see on console.

As a result though, PC will end up with better IQ at the cost of headroom for potential drops and responsiveness to those drops. I asked Nixxes though if they would consider just implementing the way DRS is done on PS5 as a different "aggressive" mode (as they actively changed DRS for PC so that it favours IQ in this granular way) and they said they were open to it.
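A rough sketch of the two strategies as I understand them - illustrative only, not Nixxes' actual controller:

```cpp
// Console-style coarse ladder vs PC-style fine-grained DRS (illustrative sketch).
#include <algorithm>
#include <cmath>
#include <vector>

// Console-style: snap down to the next rung, leaving headroom and recovering fast.
int LadderHeight(int current, const std::vector<int>& rungs /* e.g. {1440,1296,1080} */) {
    for (int h : rungs)
        if (h < current) return h;      // one big chunk of pixels dropped at once
    return rungs.back();
}

// PC-style: scale pixel count in proportion to the measured frame-time overshoot.
// Pixel cost is roughly linear in pixel count, so each axis scales with sqrt.
int FineGrainHeight(int current, double frameMs, double targetMs) {
    double axisScale = std::sqrt(targetMs / frameMs);
    return std::clamp(static_cast<int>(current * axisScale), 1080, 2160);
}
```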
 
Was too curious so obtained (cough) Spiderman RM for the PC to just compare on a technical level to the PS5 version, which I've prob played through 3 times by now. [snip]

Great analysis. Really interesting observation about DRS scaling up image quality.

I assume if you are running at 4K with DLSS Quality set (1440p internal), then enabling DRS allows the internal res to scale all the way up to 4K, at which point you either have the equivalent of DLAA, or DLSS just does nothing (hopefully the former, or else image quality would drop at 4K).
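For reference, these are the standard per-axis input scales for the DLSS 2 modes (publicly documented); presumably the DRS path just varies the input between these bases and native:

```cpp
// Fixed per-axis input scales for the standard DLSS 2 modes at a 4K output.
#include <cstdio>

int main() {
    struct Mode { const char* name; double scale; } modes[] = {
        {"Quality",     2.0 / 3.0},  // 3840x2160 -> 2560x1440
        {"Balanced",    0.58},       // -> ~2227x1253
        {"Performance", 0.5},        // -> 1920x1080
    };
    for (const auto& m : modes)
        std::printf("%-12s %4.0f x %4.0f\n", m.name, 3840 * m.scale, 2160 * m.scale);
}
```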

On the DLSS sharpening, does anyone know if you can turn it down through the Nvidia control panel?

As an aside, I'm not sure what frame pacing issues you're referring to on Death Stranding. I've put 100hrs plus in over the last few weeks on my 1070 and it's been smooth as butter the whole way. Brilliant game.
 
As a result though, PC will end up with better IQ at the cost of headroom for potential drops and responsiveness to those drops. I asked Nixxes though if they would consider just implementing the way DRS is done on PS5 as a different "aggressive" mode (as they actively changed DRS for PC so that it favours IQ in this granular way) and they said they were open to it.

It'd be great to have the option, although I assume with a VRR display these small drops wouldn't be particularly noticeable and so it can still be advantageous to have the better image quality.

Seems to me a sharpness slider is an equally important enhancement as I've seen a few complaints about the sharpness of DLSS/DLAA.
 
I assume if you are running at 4K with DLSS Quality set (1440p internal), then enabling DRS allows the internal res to scale all the way up to 4K, at which point you either have the equivalent of DLAA, or DLSS just does nothing (hopefully the former, or else image quality would drop at 4K).

Yeah probably, not really sure how DLSS works with DRS so just guessing.

On the DLSS sharpening, does anyone know if you can turn it down through the Nvidia control panel?

Nah, never seen that work - Nvidia's sharpening through their CP is always additive; it can't remove sharpening that's added by the dev. It's why people were clamoring for the God of War devs to add a slider, which they eventually did, as it had the same issue at launch.

As an aside, I'm not sure what frame pacing issues you're referring to on Death Stranding. I've put 100hrs plus in over the last few weeks on my 1070 and it's been smooth as butter the whole way. Brilliant game.
Edit: Here's what I see (250MB mp4; def recommend downloading it and viewing it in your video player instead of inline, as the browser messes up the frame pacing itself and makes it appear far worse than it actually is).

The last thread I contributed to on this was here; there have been other threads on Reddit/Steam forums over the years reporting it on both versions as well. I don't know if it's restricted to gamepad users or people who play on 60Hz displays, but it's been a consistent issue across 3 systems for me - different motherboards/CPUs/GPUs, much like that person experienced. I even went so far as to install Linux just to see if the Vulkan wrapper under Proton could fix it - nope (it was actually worse).

Basically, the frametime graph/fps doesn't budge from 60, but when you approach buildings - or, oddly enough, in your private room - the game exhibits micro-stutter. This stutter is fine-grained enough that with a mouse, which can't produce the perfectly smooth, consistent linear motion a gamepad can, you may not notice it when sweeping across the scene; add the nature of mouse movement to a VRR display and it's likely just difficult to detect for some higher-end users. That, and its intermittent nature - out in the wilderness I can go 10+ minutes and not see it. I can also load up the game for the first time, stand in front of a structure that caused this stutter, and have it be fine - but walk 5 minutes in one direction and back, and it returns.

"Frame pacing" is basically just a placeholder term for that I think could be the issue here, but if you switch to photo mode in these problem areas, the stuttering is completely gone (despite the CPU load staying similar), so doesn't seem necessarily endemic to how the engine delivers frames (that and its sporadic nature).
 
An update btw: last night I played for about 90 mins more and did not experience one stutter that wasn't related to plain GPU load when flying around the city. So perhaps those intermittent stutters were indeed async shader compilation? When I first loaded the game I went right into it and didn't sit at the menu for any background shader compiling to finish, so perhaps that contributed to my initial experience.

The only other negative performance characteristic I see in non-RT mode is that in certain areas, such as the Financial District with its many tall buildings, DLSS Quality at 4K is just too much for my 3060, with drops into the 50s when gliding over large swaths of a densely populated area - so dynamic res really is a must to maintain 60fps. The aforementioned DRS scaling issues can still appear, but for the vast majority of the time it maintains a solid 60 in these areas, as swinging around gives it enough time to compensate for the variable GPU demands.

Unfortunately, in this area in particular you really see that CPU culling problem when you're suddenly presented with an overview of the downtown core. This is mostly not related to GPU load: when you run up a building and suddenly reach the top, exposed to the skyline with all the detail that was just occluded, you can get several frames of stutter while the CPU sorts things out. When you're swinging between the buildings this is never an issue, so it's very location-specific - in certain time trials you can get a little frantic chasing a goal, launching yourself up and over buildings, and consistently get these 1-3 frame lurches, which are annoying. In tests in the same area even the PS5 isn't immune: I can run up some of the taller buildings and do a 180 at the top to induce a couple moments of stutter, but I really have to work to replicate it, whereas on PC it's much more common.

Also, at 1080p (and even 1440p!) without RT I can definitely be CPU limited. At 1080p, swinging through downtown, I basically top out at 75% GPU usage. Now, I'm getting 75-100fps, so it's not like it's 'bad' performance, but even at 1440p my GPU will not be fully utilized - again, without RT. So that, plus the culling issue producing sub-60fps drops, indicates to me that the CPU demands of this game aren't just related to ray tracing. CPU and GPU load are extremely variable depending on where you are on the map and what you're doing; the problem with benchmarks is that they want to keep the movement as linear as possible for the sake of reproducibility when testing different components, but that ends up missing a lot of the stress areas you'll hit when performing certain actions in the game.
 