Ok, had to check it out myself for brief tech impressions on a midrange system - i5-12400F, RTX 3060 12GB, 32GB DDR4-3200.
First off, I'm wondering if the Zen 2 performance bottleneck extends beyond gameplay. On my system, from a clean cache, the entire PSO compilation process took just under 5 minutes. Both are 6-core, 12-thread CPUs, and sure, the 12400F is indeed a fair bit zippier than the 3600, but the 3600 in DF's video took 10 minutes! I certainly haven't seen a 2x performance gap between a 12400F and a 3600 reflected in any other benchmark.
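To put a rough number on that suspicion: a back-of-envelope check, assuming the 12400F holds something like a ~30% multithreaded advantage over the 3600 (a ballpark figure from aggregate benchmarks, not a measured value), predicts a much smaller gap than the one observed:

```python
# Rough scaling check: if PSO compile time were purely CPU-throughput-bound,
# how long should the 3600 take given the 12400F's ~5-minute result?
ratio_12400f_vs_3600 = 1.3   # assumed multithreaded perf advantage (ballpark)
time_12400f_min = 5.0        # measured: just under 5 minutes

predicted_3600_min = time_12400f_min * ratio_12400f_vs_3600
print(f"Predicted 3600 compile time: {predicted_3600_min:.1f} min")
# ~6.5 min predicted vs. the ~10 min DF measured - the extra gap suggests
# something beyond raw CPU throughput (e.g. a Zen 2-specific bottleneck).
```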
Also, while I can't recall if it was mentioned in DF's review (it's probably assumed, as most Sony ports thankfully operate this way), just to be clear: yes, the shader optimization notification is just that - a notifier, not a blocker. You can jump right into the game and it will compile in the background on low-priority threads. In fact, on my first run, before I wiped my shader cache to time it in full, I said "fuck it" and started the game at 20% shader compilation. I then went through about 3 minutes of the opening sequence, nary a blip. Now, this doesn't mean you won't get shader stuttering if you're halfway through the game, get a driver update, and jump right back in with no waiting - perhaps the process prioritizes compiling assets for the starting area first. But at least in my brief test of the opening sequence, it was not a bottleneck at all in maintaining 60fps - hell, I was barely over 50% CPU.
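That non-blocking behavior is, in effect, the standard pattern of draining a PSO compile queue on a background worker while the game loop carries on, exposing only an advisory progress counter. A minimal sketch of the pattern (the class, names, and structure here are my own illustration, not the game's actual code):

```python
import queue
import threading
import time

class BackgroundPsoCompiler:
    """Compiles queued pipeline states on a worker thread; gameplay never waits."""

    def __init__(self, jobs):
        self.jobs = queue.Queue()
        for job in jobs:
            self.jobs.put(job)
        self.total = len(jobs)
        self.done = 0
        self.lock = threading.Lock()
        # Daemon thread, never joined from the game loop: the progress
        # counter is purely advisory - a notifier, not a blocker.
        threading.Thread(target=self._worker, daemon=True).start()

    def _worker(self):
        while True:
            try:
                job = self.jobs.get_nowait()
            except queue.Empty:
                return
            self._compile(job)            # stand-in for the real PSO compile
            with self.lock:
                self.done += 1

    def _compile(self, job):
        time.sleep(0.001)                 # simulate compile work

    def progress(self):                   # what the notifier UI would read
        with self.lock:
            return self.done / self.total

compiler = BackgroundPsoCompiler(jobs=list(range(200)))
# The "game loop" starts immediately at 0% and just polls progress:
while compiler.progress() < 1.0:
    time.sleep(0.01)                      # render frames here
print("compiled:", compiler.progress())
```

Starting the game at 20% just means entering that loop before the queue is drained; the worker keeps chewing through jobs regardless.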
I didn't take much time tweaking settings before capturing framerates of the opening chase scene (with a fully compiled cache this time, though again it seemed to make little difference). I just wanted to see if I could get 4K 60 with DLSS Performance, so I was shooting for comfortable headroom in the opening scene. These are the settings I went with:
(BTW, of note: a quirk I discovered, which I've seen with a number of DX12 games - changing the graphics settings may degrade your performance until you restart the game. I was futzing around with high/ultra/medium, and after putting everything back to the settings above, I couldn't get above 90% GPU utilization and performance was just over 50fps. Reloading the game fixed it, so be aware.)
(Also, I really like that the settings menu shows PSO compilation/VRAM status in the corner.)
With those settings, during the entire 5+ minute opening chase sequence, I maybe had 3 little 'blips' - not stutter, just moments where the frametime was so briefly interrupted that RT's default polling frequency couldn't even catch it as a frame drop; I just saw a slight tick on the graph. GPU utilization was usually around 80-85%; the highest it got was near the fur of the wolves, where during one brief camera pan I did get a drop to 58fps. Otherwise there's seemingly plenty of headroom at my 4K DLSS/medium/high settings, albeit again this is just the opening sequence. So at least from this early look, this doesn't seem to be a title that's exorbitantly GPU-demanding, or at least it scales well. I certainly can't get 4K DLSS Performance @ 60fps in Horizon Forbidden West or Ghost of Tsushima without going to the absolute lowest settings, and even that's not enough for Forbidden West.
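Why a blip like that can vanish from an FPS readout: a polling-based counter averages over its whole window, so a single long frame barely moves the number, while a per-frame frametime graph still shows the tick. An illustrative calculation (the 1-second window and 33 ms hitch are my assumptions for the example, not measured values):

```python
# One second of 60fps frames with a single ~33 ms hitch in the middle.
frametimes_ms = [16.7] * 59 + [33.3]                      # one long frame

# A polling FPS counter: frames rendered divided by elapsed time.
polled_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
print(f"Polled FPS over the window: {polled_fps:.1f}")    # still reads ~59
print(f"Worst single frame: {max(frametimes_ms)} ms")     # the visible tick
```

The averaged readout stays within a frame of 60, which is why the hitch only registers as a tick on the frametime graph rather than a logged frame drop.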
A familiar YouTube comment on DF's video seems to be "lol he says it's a great port but then spends 20 minutes pointing out the flaws". I mean, granted, it's YouTube, so what do you expect, but from my experience the point of Alex's review was solid: the fundamentals of this title seem to be quite strong, and in the case of shader stutter, maybe even superlative. There are some definite quirks, but they do seem to be just that - specific areas with specific weird stuff that definitely needs to be brought to attention. This is not a case where the foundation of the engine's PC version is built on shaky ground, as it can be with other ports.
(Oh, btw: all of these tests were run with the game installed on a 5400 rpm HDD.)