Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

So Gotham Knights doesn't run well on anything, 4090 included. On consoles it's 30 frames with constant stutter. On PC it stutters when driving. How is this game in such a shoddy state technically? I mean, it doesn't run well on anything.


It's because it's current gen only. :runaway: State of the art, man, and so much better than all those cross-gen games. :rolleyes:

Regards,
SB
 
He didn't compare the 1440p mode as much because it's way more CPU limited, which is a separate thing from GPU benches.

When has he ever cared about doing GPU comparisons with mismatched CPUs? Especially when it gives the advantage to the PS5 (see the previous Spider-Man video). And in any case I don't buy that this game would be CPU limited on a Zen 2 or even Zen+ at a 60fps target. Even the Jaguars of the PS4 have been shown to run it near 60fps, and we see the PS5 approaching 120fps in the fully unlocked mode.

You can argue that the 4K comparison isn't a GPU test at all - it's a VRAM test. And therefore the less VRAM heavy 1440p mode would actually be a better GPU test.


It'd be interesting to see what an external frame rate limiter like RTSS can do for this game.

Did I say shit it? I meant ship it. My bad.

I actually thought you meant shit it. Totally appropriate :ROFLMAO:
 
A Plague Tale: Requiem is a very heavy game: 1440p 30/40 fps on XSX and PS5, and 1080p 30/40 fps on XSS, and the PC performance shows it too. The game runs better on the Xbox consoles. It's GPU bound and doesn't use raytracing, though a raytracing patch will arrive on PC. But nothing a 4090 can't solve. I think that will be the only GPU capable of 4K 60+ fps with the raytracing patch.



It's been a long time since I've looked forward to a Digital Foundry performance analysis this much, but I was curious about the performance. Now I await DF's coverage.

EDIT: And the funny part is that whether the 4090 is GPU bound or CPU bound, the GPU is just that powerful.

I got this today and it is very heavy but it's easy to see why, there's so much alpha on the screen when you're out in the open.

My 3060 Ti with all settings on high at native 1440p hits over 60fps when there's little to no alpha on screen, and I've seen it go as low as 33fps when there's a crap load of it on screen.

My 3060 Ti isn't boosting as high as it normally does either; it's running ~200MHz lower than normal.

One of the reasons it looks so good is because it uses screen space shadows, a technique I'm a huge fan of as it just adds so much depth to everything.
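
For anyone curious what screen space shadows actually do: the idea is to march a short ray through the depth buffer toward the light and darken pixels the ray finds occluded, which is what adds that extra contact depth. Here's a toy 1D sketch of the idea (the heightfield, light slope, and step count are all made up; a real implementation marches a 2D depth buffer in a shader):

```python
# Toy 1D "screen-space shadow" march: from each sample, walk toward the
# light and flag the sample shadowed if the heightfield rises above the
# ray. Real games do this against the depth buffer per pixel.

def screen_space_shadow(heights, light_slope, steps=8):
    shadowed = []
    for i, h in enumerate(heights):
        ray = h
        occluded = False
        for step in range(1, steps + 1):
            j = i + step
            if j >= len(heights):
                break
            ray += light_slope          # ray climbs toward the light
            if heights[j] > ray:        # geometry pokes above the ray
                occluded = True
                break
        shadowed.append(occluded)
    return shadowed

# A bump at index 3 casts a short contact shadow on the ground before it.
heights = [0, 0, 0, 3, 0, 0, 0]
print(screen_space_shadow(heights, light_slope=0.5))
# [True, True, True, False, False, False, False]
```

The short, fixed step count is why the technique is cheap and why it only produces those small-scale contact shadows rather than replacing the regular shadow maps.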

The asset quality so far is super high and super consistent, so I'm not shocked the consoles are capped to 30fps.
 
Runs and looks great on my 1070. I've got it locked through RTSS to 34fps and for the most part frame times are just a perfectly straight line. There's the odd spike in heavy areas but nothing particularly noticeable. That's at all high except shadows on medium, at 3840x1600 with 50% resolution scale.

Given the low res, image quality is surprisingly very good. This game must have a great TAA implementation, and I think the art style is well suited to a softer image.

It's amazing how well games scale these days. I guess that's a symptom of them targeting 4K at the high end, which leaves tons of resolution headroom at the lower end while still maintaining acceptable image quality.
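
For a sense of how much headroom that is, here's the raw pixel math for the setup above (assuming, as most render-scale sliders do, that the 50% scale applies per axis):

```python
# Pixel-count math for the post above: 3840x1600 at 50% render scale
# versus native 4K. The per-axis scale interpretation is an assumption.

def internal_res(width, height, scale):
    """Internal render resolution for a per-axis render-scale factor."""
    return int(width * scale), int(height * scale)

def pixel_ratio(res_a, res_b):
    """How many times more pixels res_a pushes than res_b."""
    return (res_a[0] * res_a[1]) / (res_b[0] * res_b[1])

native_4k = (3840, 2160)
low_end = internal_res(3840, 1600, 0.5)

print(low_end)                                     # (1920, 800)
print(round(pixel_ratio(native_4k, low_end), 1))   # 5.4
```

Roughly 5.4x fewer pixels than native 4K, which is most of why a 1070 can still hang on.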

Never thought my mid PS4 gen, non top of the line GPU would still be going strong 2 years into the PS5 gen.
 
But it's not targeting 60fps in the 1440p mode, it's targeting 120, and it definitely will be CPU limited in that case.
 

This makes no sense. There's a 1080p mode which has much higher frame rates. Ergo the CPU is not the limit at 1440p, the GPU is.

And besides, against his typical 2700x, if anything it would be the PC that's CPU limited there, not the PS5.
 
Persona 5 Royal on Steam is sublime. Silky smooth. The style and animation in this game is perfect for high refresh rates.

I'm honestly pretty darn happy right now. The Uncharted Collection and Persona 5 both turned out to be great ports, Spider-Man: Miles Morales is coming, and Vampire Survivors has been updated to 1.0.

RE4 Remake was shown off and looks incredible. God of War Ragnarok previews are hitting and that looks like a serious GOTY competitor to Elden Ring! Can't wait for that.
 
Uncharted is a lot of things but a great port isn’t one of them.
 
My standards for a "great" port basically mean no shader compilation stuttering, and that the game functions as it should.

This Uncharted release ticks the boxes. Anything above and beyond that would be an "awesome" port lol.

But no worries, I realize not everyone shares my opinion. There are a lot of games out there that do go above and beyond and add a lot of bespoke PC features and flourishes... but they don't hit the most basic "functions as it should" metric, which is far and away the most important thing to me.
 
Well I think that's a bit generous, but I can understand it. I'm first and foremost about frame consistency as well.

Most of the tech reviews of this title felt a little lacking in answering my specific concerns, so curiosity got the better of me and I temporarily nabbed (ahem) a copy of UC: Legacy of Thieves to see on a technical level how it fares on my 3060/i5-12400 system, especially compared to my PS5 copy.

The Good:

  • Indeed, that framerate consistency! I've played for about 90 minutes, and who knows beyond that (no driving sections yet), but so far I have not seen one stutter, frame skip, blip, or bit of bad frame pacing that wasn't attributable to GPU load (which always reaches full load too, btw; no weird vsync issues). I'd actually say it's one of the most consistent titles in delivering frames that I've played. Now here's the thing though... I started it without letting the shader compile finish, just to see what effect it had. Absolutely none. The shaders continued to process in the background, but that's precisely it: the background. Now granted, my target was 60fps, so perhaps a locked 120 would reveal issues, but then it's also just an i5-12400. My fear that you would be waiting '20 minutes after every driver update' appears largely unfounded. Total CPU load was around 35-40% while this was happening, so it does look like they took special care here. Maybe if you jump back into a fierce firefight level after a driver upgrade this wouldn't hold, but then again, while Lost Legacy has you slowly strolling through a market from the start, UC4 throws you right into a speedboat firefight on raging seas, which had no issues when shaders were only 15% done. You will get a longer initial load if you don't wait for the shaders to finish, sure, but even that's very minor: it's a one-time load when starting the game, and even if you skip cutscenes, subsequent loads take just a few seconds, if you see a loading screen at all. This is from a SATA SSD as well.
  • UC4 is able (so far) to deliver 60fps at 4K on high/Ultra (textures/aniso) with DLSS Performance through about 95% of my run, better than I expected. The opening scene with your boat in a storm, being fired upon with tons of alpha effects? Locked. Running through the prison riot with huge explosions? No problem. Large gang fight? Close-ups of characters being driven face-first into the dirt? Solid 60. In a few scenes early in the orphanage, and some scenes underwater and with foliage outside at certain camera angles, I saw drops into the 50's, but very brief. There are no wild swings in load just because you turned a corner; even CPU load is remarkably consistent at a constant 32-37% of my i5. I don't think I've seen a game deviate so little through so many different scenes. Again, I'm relatively early, and note that I'm specifying UC4, not Lost Legacy, its sister (and better, imo) campaign. More on that later.
  • Immediate switch of iconography if you move to Xbox/PS4 controllers. A minor benefit but some games do get confused, not here.
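
The background compilation behavior described in the first bullet can be sketched as a simple worker-queue model. This is a toy Python illustration (the shader names, count, and timings are invented), not how the actual port is implemented:

```python
import queue
import threading
import time

# Toy model of background shader compilation: the render loop starts
# immediately while a worker thread drains the compile queue behind it,
# which is the pattern the post describes.

def compile_shader(name):
    time.sleep(0.001)               # stand-in for real compilation work
    return name + ":compiled"

def start_background_compile(shaders, compiled, done):
    work = queue.Queue()
    for s in shaders:
        work.put(s)

    def worker():
        while True:
            try:
                s = work.get_nowait()
            except queue.Empty:
                break
            compiled[s] = compile_shader(s)
        done.set()

    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return t

shaders = ["shader_%d" % i for i in range(50)]
compiled = {}
done = threading.Event()
start_background_compile(shaders, compiled, done)

# "Render" a few frames right away; anything not compiled yet would use
# a fallback shader instead of stalling the frame.
for frame in range(3):
    ready = sum(1 for s in shaders if s in compiled)
    time.sleep(0.005)

done.wait(timeout=5.0)
print(len(compiled))    # all 50 end up compiled without blocking the loop
```

The key design choice is that the render loop never waits on the queue; worst case it draws with a cheaper fallback until the worker catches up, which matches the "longer initial load but no stutter" behavior described above.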

The Middling:

  • Lost Legacy performance. I've only just started the opening scene and walked around in the alleyways, but this is definitely heavier than UC4. (UPDATE: Initially with DLSS Performance at 4K I was reporting framerates in the mid 30's, but I reloaded and now get the mid 50's in the market, and 60's once outside of it. Not amazing, but certainly better than before.) So heavier than UC4, but then I can just lower my res to my usual 1800p when 4K is too much. Except there's a problem with that:
  • Tying into the above: no fullscreen exclusive mode. Now I know that's not exactly uncommon with DX12 titles, but this goes one step further on the annoyance scale, as you cannot change your resolution from your display's default. OK, so just use the render scale slider, that's what it's for! Well, two problems with that:
    • You cannot use render scale if you also use DLSS or FSR. So on a 4K display, DLSS Ultra Performance derived from your native res is the best you can get. Sure, you can get 60fps with that. It, of course, looks like shit. DLSS Performance isn't fast enough, so you're kinda SOL.
    • If you don't use DLSS, then the render scale automatically uses TAAU. So a render scale of 50%, the lowest it can go, is even worse in framerate than DLSS performance.
    • So the only way to get performance up when scaling/DLSS isn't doing it is to change your desktop resolution, so you start from a non-reconstructed res. Kinda bush league stuff here. The only bright spot is that since it's a borderless window, you can just alt-tab to your desktop, change it, and flip back in; the game picks it up immediately. This really shouldn't be necessary though.
  • DLSS. In most cases, it's very good. DLSS Performance at 4K is a noticeable step up from the PS5's performance mode (1440p) in image quality. 1440p with DLSS Quality? Ehhhh, maybe not. There are still some bugs, regardless of the starting res or DLSS quality mode, that don't appear at all at the equivalent native res DLSS starts from. Sometimes it can really freak out, which usually manifests in a sudden spurt of flickering white pixels, like in a scene escaping the orphanage where glow from the moon hits a chain link fence. These were specific scenes and very brief, but they were distracting when they occurred. If that was the extent of it, considering the rest of its quality, the DLSS implementation might actually be in the 'good' category. Unfortunately...

The Ugly:

  • DLSS affects shadow quality. I'm surprised how many have missed this; it's pretty evident when you're switching DLSS on/off. Even when the graphics menu blurs the background, you can immediately see the scene dim with more detailed shadows without DLSS. I don't know if this is just another DLSS LOD issue, though there's no indication it's happening with textures, so I doubt it. Unlike the other artifacting problems, the DLSS quality modes do at least affect this, which means DLSS Performance shows it far more noticeably than DLSS Quality, but it should definitely not be happening regardless of mode. It's also amplified by the fact that shadows were probably the one area that needed the most improvement: even Ultra shadows without DLSS are an extremely minor upgrade over the PS5, and there are still scenes with blocky flickering shadows and shadow banding. With DLSS though, even in Quality mode, you're getting worse shadows than default. Again, my skepticism whenever I hear "Just turn DLSS on, no reason not to!". 🙄
  • The 30fps option has, not surprisingly, shit frame pacing. Yeah, this is not something most members of this forum will ever use, and I know it's rare for PC games to get this right to begin with, but it never makes sense to include it and not care about it: most people using such a locked option will want consistent frame delivery if they're going to settle for such a low framerate. Thankfully, RivaTuner with Scanline Sync x2 set to 1 does remedy it in my tests, but that usually adds more latency than the game's own offering. So get this right, devs, if you bother to include it; don't just set a frame cap and call it a day. Double buffer that shit.
  • Aside from the DLSS bugs, some random lighting bugs. For example, playing as young Drake in a room, a random spotlight kept appearing on the floor. These are rare so far though.
  • Scaling of graphics settings. As DF mentioned, the settings offer very little in both image quality and performance gains. The worst of both worlds.
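
On the "don't just set a frame cap and call it a day" point: the usual fix is to pace frames against absolute deadlines rather than sleeping a fixed amount each frame, so jitter in one frame doesn't accumulate into uneven delivery. A toy Python sketch of that idea (not how RTSS's scanline sync actually works, and the workload numbers are arbitrary):

```python
import time

# Deadline-based frame pacer: each frame targets an absolute timestamp,
# so variance in per-frame work or sleep overshoot self-corrects instead
# of drifting. A bare frame cap (sleep a fixed amount) does not do this.

def run_paced(frame_work, n_frames, target_fps):
    frame_time = 1.0 / target_fps
    stamps = []
    next_deadline = time.perf_counter() + frame_time
    for _ in range(n_frames):
        frame_work()                            # simulate rendering
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)
        stamps.append(time.perf_counter())
        next_deadline += frame_time             # absolute, so no drift
    # measured frame-to-frame deltas
    return [b - a for a, b in zip(stamps, stamps[1:])]

deltas = run_paced(lambda: time.sleep(0.005), n_frames=30, target_fps=30)
avg_ms = 1000 * sum(deltas) / len(deltas)
print(round(avg_ms, 1))   # typically ~33.3 on an idle machine
```

Because the deadline advances by a fixed step regardless of when the frame actually finished, one late frame is followed by a shorter sleep, which is exactly the "consistent delivery" property a locked 30fps mode should have.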
So yeah, they gotta fix that DLSS shadow & white pixel problem*, along with the motion blur & LOD that Alex covered. There's work to be done no doubt.

*See posts a few down. This does not occur under FSR.

But like @Remij said, the starting base is actually very solid. Despite the egregious GPU load, I would still rate this a 'good' port atm, with the caveat that they fix those bugs of course. If they don't, well then no.

It may not scale as high as expected given the GPU grunt it needs, and older cards are not delivering the performance they should either, sure. But it's also extremely consistent; it's a title that's going to shine as you upgrade your hardware. Your thought won't be "I really hope this upgrade resolves those frame skips and stutters", it will just be "Cool, I get better performance at higher res now". That's what you should expect.
 
The PS5 averages about 100fps in the 1440p uncapped performance mode, with brief dips to the 60s or 70s in the worst spots.

 
Yeah I know, I've mentioned it numerous times. Not sure what your point is, as you're not replying to anything specific. As I said repeatedly, the GPU demands of this title are way out of whack with 99% of ports I've seen wrt the GPU you need for the equivalent console experience, but I can still appreciate a title that's at least consistent in its frame delivery.
 
You mentioned not knowing how far above 60 PS5 goes.
 

Dude, please quote the specific point you want to address.

  • Lost Legacy Performance. I've only just started the opening scene and walked around in the alleyways, but this is far more heavy than UC4. The opening scene in the market has me at the high 30's with 4K and DLSS Performance (Chloe's expression there mirrored mine at that point) - worse than the PS5 at 4k (!). Even going on from that into far more restricted areas, it can't reach 60, albeit in the 50's at least. 1800p with DLSS performance? Maybe. But really 1440p with DLSS Quality for a locked 60, which makes this by far the game with the widest disparity between the PS5 and my 3060 for basic raster performance (the PS5 is a locked 60 in those areas at 1440p, but not sure how much farther it could go).

I was referring to the opening scenes of The Lost Legacy in particular. Since I don't have a VRR set I can't see the uncapped performance for those exact scenes. I was just thinking about how far out of whack the PC would be in those scenes, since they were so much heavier than UC4.

However, it looks like it's the ol' 'reboot the game after fucking with graphics settings too much' problem. I just reloaded and checked: in that market area with DLSS Performance at 4K, it's now in the low to mid 50's. Still not great for DLSS Performance, of course, but certainly not as bad as the mid 30's I was getting earlier. Areas in the alley and through buildings on the roofs are a solid 60.
 
  • ...but this goes one step further on the annoyance scale as you cannot change your resolution from your displays default.
...
  • ...Again, my skepticism whenever I hear "Just turn DLSS on, no reason not to!". 🙄

It boggles my mind anytime I run into a game on PC that doesn't allow you to manually change resolution. Most of those games will at least let you change the size of the window in windowed mode, and thus the render resolution, but even that isn't necessarily a given, and sometimes you'll be stuck with a 1920x1080 window on a 4K screen. Grrrr. Worse still are games like this that auto-default to native desktop resolution in borderless windowed mode.

And yeah, agreed on the DLSS. It's fantastic tech, but I just can't use it in most games due to the annoying artifacts or visual downgrades that it introduces in most games. That said, I'm always interested when this type of tech (DLSS, FSR, etc.) improves. Someday I hope it gets to a place where I won't physically cringe when enabling it as tech like this is likely the future due to rapidly increasing costs associated with silicon node shrinks.

Regards,
SB
 

I'm not quite at that level of annoyance with it. I certainly think DLSS is worth it over the equivalent native res you'd have to drop down to for the same level of performance, it's usually better than other tech like checkerboarding, and it can even beat some games' native TAA in places (I mean, I wouldn't put it past Capcom to royally fuck up the implementation, but I would bet DLSS/FSR would be a big improvement over RE Engine's awful TAA). It's just that when it 'breaks', it does so in a way that's pretty glaring to me. Part of it may be the juxtaposition against the rest of the image, which can look excellent, but the problem isn't that a certain effect is scaled up from its native res less gracefully than other elements; it's that DLSS is creating wholly new issues that don't exist in 'dumb' upscaling. Basically, sometimes DLSS is trying to interpret something and just losing the plot.

As those videos of UC4 DLSS vs FSR showed though, many of these glaring artifacts can likely be addressed; this isn't necessarily an impossible hill to climb for reconstruction, even if the miracle breakthrough of machine learning isn't involved. :) Similar low-res buffer artifacts have occurred in other titles, particularly with DLSS Performance mode, so I'm not sure all of these issues in UC4 are entirely the fault of Iron Galaxy here. Guess we'll see with future patches.

This spurred me to do a test using the FSR 2.1 mod for Horizon Zero Dawn. While it had its own particular artifacts compared to straight DLSS (hair, ghosting on some grass; it is a 'mod', mind you), it also didn't have those issues within Cauldrons, or the particularly annoying low-res specular shimmering on some objects behind certain steam/smoke effects. So I'm inclined to believe these faults may at least partly fall on Nvidia's side of the aisle. Whether it's a developer education issue, or something inherent in how FSR deals more gracefully with low-res buffer effects than DLSS when devs miss something, I don't know; I'd have to see more examples. But getting less artifacting in one area through what is basically a hack for a game was certainly interesting.

I'm just annoyed that I usually only discover these for myself. I'm probably more critical than many, but at the same time I'm not exactly zooming in to see these either. But man, the shadow problem in UC4 with DLSS (and FSR to a slightly lesser extent) is pretty damned glaring:

PS5 Shadows 1440p
PC Shadows DLSS 4k Performance

😟
 
Last edited:
Well I think that's a bit generous, but I can understand it. I'm first and foremost about frame consistency as well.

Most of the tech reviews of this title felt a little lacking in answering my specific concerns, so curiosity got the better of me and as such, I temporarily nabbed (ahem) a copy of UC: Legacy Of Thieves to just see on a technical level how it fares on my 3060/I5-12400 system, esp compared to my PS5 copy.

The Good:

  • Indeed, that framerate consistency! I've played for about 90 minutes, and who knows beyond that (no driving sections yet) - but so far, I have not seen one stutter, frame skip, blip, bad frame pacing etc that was not attributed to GPU load (which always reaches full load too btw, no weird vsync issues). I'd actually say it's one of the most consistent titles in delivering frames that I've played. Now here's the thing though...I started it without letting the shader compile finish just to see what effect it had. Absolutely none. They continued to process in the background, but that's precisely it - the background. Now granted my target was 60fps so perhaps that would reveal issues with going for a locked 120, but it's also just an i5-12400 too.. My fear that you would be waiting '20 minutes after every driver update' appears largely unfounded. CPU load in total was around 35-40% while this was happening, so it does look like they took special care here. Now maybe if you jump back into a fierce firefight level after a driver upgrade this may not be the case, but then again while Lost Legacy has you slowly strolling through a market from the start, UC4 throws you right into a speedboat firefight on raging seas, which had no issues when shaders were only 15% done at that point. You will get a longer initial load if you don't wait for the shaders to finish sure, but even that's very minor- that's a one-time load when loading up the game, even if you skip cutscenes, following scenes can take just a few secs - if you even see a loading screen at all. This is from a SATA SSD as well.
  • UC4 is able (so far) to deliver 60fps at 4K on high/Ultra (textures/aniso) with DLSS Performance through about 95% of my run - better than I expected. The opening scene with your boat in a storm, being fired upon with tons of alpha effects? Locked. Running through the prison riot with huge explosions? No problem. The large gang fight, with close-ups of characters being driven face-first into the dirt? Solid 60. In a few scenes early in the orphanage, and some underwater and foliage-heavy scenes outside at certain camera angles, I saw brief drops into the 50's. There are no wild swings in load just because you turned a corner; even CPU load is remarkably consistent at 32-37% of my i5 - I don't think I've seen a game deviate so little across so many different scenes. Again, I'm relatively early, and note that I'm specifying UC4, not Lost Legacy, its sister (and better, imo) campaign. More on that later.
  • Immediate switch of iconography if you move to Xbox/PS4 controllers. A minor benefit but some games do get confused, not here.
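That compile-in-the-background behaviour is worth dwelling on, since it's what so many PC ports get wrong. A minimal sketch of the pattern (illustrative Python only, not the game's actual code - the names `get_pipeline` and `compile_worker` are my own invention) is: the render path never waits on the compile queue, it just draws with a fallback until the real pipeline is ready.

```python
import queue
import threading
import time

# Hypothetical sketch of background shader compilation: the render
# thread never blocks on the compile queue, so an uncompiled pipeline
# costs you a frame of fallback visuals, not a stutter.
compile_queue: "queue.Queue[str]" = queue.Queue()
compiled: set = set()
lock = threading.Lock()

def compile_worker() -> None:
    """Drain the queue in the background for the process lifetime."""
    while True:
        name = compile_queue.get()
        time.sleep(0.001)  # stand-in for the actual driver compile
        with lock:
            compiled.add(name)
        compile_queue.task_done()

def get_pipeline(name: str) -> str:
    """Render-thread path: never wait on compilation."""
    with lock:
        if name in compiled:
            return name        # fast path: use the real pipeline
    compile_queue.put(name)    # schedule it; draw with a fallback now
    return "fallback"

threading.Thread(target=compile_worker, daemon=True).start()
```

The first `get_pipeline("water_caustics")` call returns `"fallback"` immediately; once the worker drains the queue, the same call returns the real pipeline - which matches the observed behaviour of starting the speedboat sequence at 15% shader completion without hitches.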

The Middling:

  • Lost Legacy performance. I've only just started the opening scene and walked around the alleyways, but this is definitely heavier than UC4. (UPDATE: Initially with DLSS Performance at 4K I was reporting framerates in the mid 30's, but after a reload I now get mid 50's in the market and 60's once outside of it. Not amazing, but certainly better than before.) So it's heavier than UC4 - but then I could just lower my res to my usual 1800p when 4K is too much. Except there's a problem with that:
  • Tying into the above: no exclusive fullscreen mode. I know that's not exactly uncommon with DX12 titles, but this one goes a step further on the annoyance scale, as you cannot change the resolution from your display's default. Ok, so just use the render scale slider, that's what it's for! Well, two problems with that:
    • You cannot use the render scale slider if you also use DLSS or FSR. So on a 4K display, DLSS Ultra Performance derived from your native res is the best you can do. Sure, that gets you 60fps. It also, of course, looks like shit. DLSS Performance isn't fast enough, so you're kinda SOL.
    • If you don't use DLSS, the render scale slider automatically uses TAAU - and a render scale of 50%, the lowest it goes, delivers an even worse framerate than DLSS Performance.
    • So the only way to get performance up when scaling/DLSS isn't cutting it is to change your desktop resolution, so you start from a lower, non-reconstructed res - kinda bush-league stuff here. The one bright spot: since it's a borderless window, you can just alt-tab to the desktop, change it, and flip back in - the game picks it up immediately. This really shouldn't be necessary though.
  • DLSS. In most cases it's very good. DLSS Performance at 4K is a noticeable step up in image quality from the PS5's performance mode (1440p). 1440p with DLSS Quality? Ehhh, maybe not. There are also some bugs - regardless of the starting res or which DLSS quality mode you use - that don't appear at all at the equivalent native res DLSS starts from. Sometimes it really freaks out, which usually manifests as a sudden spurt of flickering white pixels - like in the scene escaping the orphanage where the moon's glow hits a chain-link fence. These were specific scenes and very brief, but they were distracting when they occurred. If that were the extent of it, considering the rest of its quality, the DLSS implementation might actually be in the 'good' category. Unfortunately...
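For anyone unsure why "DLSS Performance at 4K" can beat the PS5's 1440p mode, the standard per-axis scale factors NVIDIA publishes for the DLSS 2.x quality modes make it clear: Performance at 4K renders internally at 1080p and reconstructs to 2160p. A quick sketch (the `internal_res` helper is mine, the scale factors are the published ones):

```python
# Per-axis input-resolution scale factors for the standard DLSS 2.x
# quality modes (as published by NVIDIA).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(width: int, height: int, mode: str):
    """Internal render resolution DLSS starts from for a given output."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

print(internal_res(3840, 2160, "Performance"))        # (1920, 1080)
print(internal_res(3840, 2160, "Ultra Performance"))  # (1280, 720)
print(internal_res(2560, 1440, "Quality"))            # (1707, 960)
```

So 4K + Performance reconstructs from a full 1080p image, while 1440p + Quality only starts from ~960p - which lines up with the "4K Performance good, 1440p Quality ehhh" impression above.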

The Ugly:

  • DLSS affects shadow quality. I'm surprised how many have missed this - it's pretty evident when switching DLSS on/off: even with the graphics menu blurring the background, you can see the scene dim with more detailed shadows the instant DLSS is off. I don't know if this is just another DLSS LOD issue, though there's no indication it's happening with textures, so I doubt it. Unlike the other artifacting problems, the DLSS quality modes do at least affect this, which means DLSS Performance shows it far more noticeably than DLSS Quality - but it shouldn't be happening in any mode. It's also amplified by the fact that shadows were probably the one area most in need of improvement: even Ultra shadows without DLSS are an extremely minor upgrade over the PS5, and there are still scenes with blocky, flickering shadows and shadow banding. With DLSS, even in Quality mode, you're getting worse shadows than default. Again, my skepticism whenever I hear "Just turn DLSS on, no reason not to!". 🙄
  • The 30fps option has, not surprisingly, shit frame pacing. Yeah, this isn't something most members of this forum will ever use, and I know it's rare for PC games to get this right to begin with, but it never makes sense to include it and not care about it - anyone settling for a locked option at such a low framerate will want that frame delivery to be consistent. Thankfully, RivaTuner with Scanline Sync x/2 set to 1 does remedy it in my tests, though that usually adds more latency than the game's own limiter. So get this right, devs, if you bother to include it - don't just set a frame cap and call it a day. Double buffer that shit.
  • Aside from the DLSS bugs, some random lighting bugs - for example, playing as young Drake in a room, a random spotlight kept appearing on the floor. These are rare so far though.
  • Scaling of graphics settings. As DF mentioned, the settings offer very little in both image quality and performance gains. The worst of both worlds.
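On the frame-pacing point: the difference between a bad in-game cap and what an RTSS-style limiter does comes down to pacing against absolute deadlines on a fixed grid rather than sleeping a fixed amount after each frame (which lets every slow frame push all later present times and accumulate drift). A rough sketch of the deadline approach, purely illustrative - the function names are mine, and real limiters work at the driver/present layer, not in Python:

```python
import time

TARGET = 1 / 30  # 30 fps cap: one frame every ~33.3 ms

def paced_frame_loop(render_frame, n_frames: int) -> list:
    """Present on a fixed grid of absolute deadlines.

    A naive cap sleeps `TARGET - frametime` after each frame, so any
    oversleep or slow frame shifts every later present. Advancing a
    fixed deadline instead keeps long-run pacing even - the idea behind
    RTSS-style external limiters.
    """
    present_times = []
    next_deadline = time.perf_counter() + TARGET
    for _ in range(n_frames):
        render_frame()
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)      # wait out the rest of the slot
        present_times.append(time.perf_counter())
        next_deadline += TARGET        # deadlines advance on a fixed grid
    return present_times

# e.g. paced_frame_loop(lambda: None, 8) yields presents ~33 ms apart
```

A double-buffered vsync cap achieves the same grid in hardware, since each present has to wait for the next refresh - which is presumably what "double buffer that shit" is asking for.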
So yeah, they gotta fix that DLSS shadow & white pixel problem*, along with the motion blur & LOD that Alex covered. There's work to be done no doubt.

*See posts a few down. This does not occur under FSR.

But like @Remij said, the starting base is actually very solid. Despite the egregious GPU load, I would still rate this a 'good' port atm - with the caveat that they fix those bugs, of course. If they don't, well, then no.

It may not scale as high as expected given the GPU grunt it needs, and older cards aren't delivering the performance they should either, sure. But it's also extremely consistent - a title that's going to shine as you upgrade your hardware. Your thought won't be "I really hope this upgrade resolves those frame skips and stutters"; it will just be "Cool, I get better performance at higher res now". That's what you should expect.

Awesome analysis. Would be great to see how it performs against the PS5 without upscaling though using Alex's matched settings.

We have NXG's and one or two other videos linked in this thread showing the PS5's unlocked performance in specific scenes at both 4K and 1080p, which would allow for direct comparisons. Your 3060 is virtually identical to his 2070 except that you're not VRAM limited, so it would make a great test of whether NXG's comparisons at 4K are significantly impacted by VRAM.
 