What a plot twist. This is shaping up to become one of the best PC ports.
@davis.anthony I think DX12 gets blamed way too much for game performance. There's a lot more to a game engine than submitting render calls to an API.
So from a pure rasterization performance perspective, the game is still wildly out of sync - to reach the PS5's unlocked framerate at 1440p without upscaling you still need something on the order of a 3080+ class card.
It is worth remembering that TLOU is a port of a highly tuned and optimised PS5 exclusive from one of Sony's most talented developers.
So the PS5's GPU will be punching above its weight compared to your average third-party game.
But that simply means it's not as well optimised for the PC, or even close to as well optimised, which in turn precludes it from being considered a particularly good port.
The PS5 is native 1440p/60fps with high settings, and if you look at PC GPU benchmarks for GPUs around its level, they're not actually that far off the PS5.
I'm expecting Ratchet & Clank to be no different when that finally hits PC, as there's only so much you can do on PC to optimise.
There are some extremely rare dips below 60fps, as DF's analysis showed with TLOU on the PS5, but it also has an unlocked mode which shows the majority of it is well over 60fps. No, it's definitely quite far off from the PS5 in GPU performance. You really need a 3080-class GPU to compete without using upscaling.
A 3060 Ti at 1080p high can still have drops under 60fps. Even if the PS5 were limited to 60fps at 1440p, that would still be a massive performance disparity from virtually every other game - save, perhaps not surprisingly, Uncharted. That's getting into roughly 2x a 3060 Ti's performance here.
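To give a rough sense of where that ~2x figure comes from, here's a quick back-of-the-envelope pixel-throughput comparison. The 55fps average for the 3060 Ti is an assumption for illustration (the benchmark only shows dips below 60), not a measured number:

```python
# Rough pixel-throughput comparison behind the "roughly 2x a 3060 Ti" remark.
# PS5: assumed locked 60fps at native 1440p.
# 3060 Ti: assumed ~55fps average at 1080p (illustrative only).
ps5_rate = 2560 * 1440 * 60          # pixels per second, PS5 at 1440p/60
rtx_3060ti_rate = 1920 * 1080 * 55   # pixels per second, 3060 Ti at 1080p/~55

print(f"PS5 vs 3060 Ti pixel throughput: {ps5_rate / rtx_3060ti_rate:.2f}x")
# prints ~1.94x
```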
I don't expect the 2070 Super to be the de facto comparison card for every game going forward during the entirety of the PS5's lifespan (even outside of VRAM limitations), but if TLOU's GPU performance profile becomes the norm, that would be...not good. It would speak quite poorly of the PC's value proposition if you needed 3080-class cards to compete in most ports, as opposed to them blowing past the PS5 in 99% of games today.
First, please forgive the high luminance or brightness look exhibited in the images, as converting HDR images (from jxr to jpg) can be a bit problematic. So, I’ve included a zip file for those wanting to see the original HDR images when using a proper image viewer like Windows Photos or HDR+WCG Viewer.
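For what it's worth, a cleaner SDR conversion than a straight jxr-to-jpg export usually needs a proper tone-mapping step. Here's a minimal sketch using OpenCV, assuming the JXR captures have first been converted to a Radiance .hdr (OpenCV has no JXR reader, and the filenames here are just placeholders):

```python
import cv2
import numpy as np

# Load the float HDR capture unchanged (assumes a prior jxr -> .hdr conversion,
# e.g. with ImageMagick; OpenCV cannot open .jxr directly).
hdr = cv2.imread("capture.hdr", cv2.IMREAD_UNCHANGED)

# Reinhard tone mapping compresses the HDR range into SDR instead of letting
# the highlights clip, which is likely what causes the overly bright look.
tonemap = cv2.createTonemapReinhard(gamma=2.2)
sdr = tonemap.process(hdr.astype(np.float32))

# Scale to 8-bit and save as JPG for sharing.
cv2.imwrite("capture_sdr.jpg", np.clip(sdr * 255, 0, 255).astype(np.uint8))
```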
Anyhow, I wanted to see how much of an image and performance difference there was with the new patch, if any, between the game's default resolution render and Nvidia's DLSS Super Resolution render. As you will see below, the game's default resolution render uses 15.79% more memory on average than Nvidia's DLSS (which rarely goes over 11GB, making it perfect for 12GB graphics card owners). The image quality differences are barely perceptible at times; however, DLSS does have a slight edge when looking at the brick patterns and delineated borders in the set 4 images, and the cleaner image pattern in the overhead fluorescent light modules in the set 6 images. And of course, DLSS has the edge in performance, with lower GPU utilization to boot.
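To put those numbers in perspective, here's a quick back-of-the-envelope calculation. Only the ~11GB peak and the 15.79% average difference come from the captures above; the rest is just arithmetic, so treat it as a rough estimate:

```python
# VRAM headroom estimate from the figures quoted above.
dlss_peak_gb = 11.0       # DLSS Super Resolution rarely exceeds this
native_overhead = 0.1579  # native render uses ~15.79% more memory on average

native_estimate_gb = dlss_peak_gb * (1 + native_overhead)
print(f"Estimated native-res VRAM use: {native_estimate_gb:.1f} GB")
# ~12.7 GB - a tight squeeze on a 12GB card, while DLSS leaves some headroom
```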
That's quite a low CPU clock by today's standards, what CPU is that?
I don't see the point in using this two-month-old video, as the data is so out of date now that it doesn't represent the game any more.
The other link I provided, showing the 3060 Ti failing to hold 60fps at 1080p, was done with the most recent patch.
Regardless, as I said in the post you replied to (and as others have also confirmed), GPU performance was not affected by this patch, and that aspect of the game's performance is what we're discussing. Your comment that PC GPUs were actually 'not far off' from the PS5 in this game is what I was replying to, and that's just not the case.
As such, even though it was from the initial launch, the performance discrepancy in GPU-limited scenarios between the PC and the PS5 has not changed to any meaningful degree since Alex's first look, so that video is perfectly valid for what we're discussing. The improvements have been in texture streaming, bugs, and CPU usage. I linked to the specific section of the video where Alex talked about the GPU performance disparity, and showed the screenshot from that section highlighting the rendering bottleneck, for exactly that reason - that's not a CPU-limited scenario. That gulf has not changed: it's a GPU hog just as it was on day one, drastically out of sync with other releases in this respect.
DLSS Quality mode at 1440p is certainly better than DLSS Performance at 1440p, which is what I currently have to resort to, but daytime scenes captured as still images don't exactly give the most thorough picture of how it behaves in-game - if those were emblematic of how DLSS performed in this game at sub-4K output resolutions, it would be a great implementation.
The problems occur at night: specifically, the post-process effect of the flashlight hotspot can cause some significant artifacting on certain surfaces that really jumps out - and you're using the flashlight quite a lot in this game. As you go up in DLSS mode and output resolution these artifacts diminish, but they're always present to some degree compared to native res.