Digital Foundry Article Technical Discussion [2024]

Could likely be a bug. It's fast now, probably because I've been in-game for ~2 hours of testing. When I was getting those constant "compiling" messages I was exiting after being in-game for 20 minutes or so, so who knows - maybe exiting with Alt-F4 corrupted the cache somehow. That's definitely not how Zero Dawn worked: you would only get longer loads with compiling if you were loading directly into an area for the first time after a new driver install. If you quit and reloaded that same area it would be as fast as expected, and the game would churn away at the shaders in background threads for the rest of the map while you played.
Yeah, that's weird. You definitely shouldn't be seeing it compile shaders again when loading saves into areas you've already been. The driver-side cache should take care of it.

Don't get me wrong, I love the way Nixxes handles shader compilation, as it creates as little initial friction as possible for the player. But I still stand by my opinion that an option in the menu to fully compile all shaders upfront would be best. Give players the ability to do it upfront - that way loading times could potentially be shorter, and there'd be less stress on the CPU during gameplay for those with lower-end hardware.
 

Yeah, I liked how HZD handled it. Just have a % progress indicator on the main menu churning away; it's up to the player whether they want to wait or jump right in. If you're confident in how well-threaded your implementation is, there's no need to even 'warn' the player, like TLOU does, that entering the game without a fully compiled cache can have performance consequences - just load the game normally and asynchronously compute away.
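For what it's worth, that menu-driven approach boils down to a background compile pool plus a progress counter the menu can poll. Here's a minimal sketch of the idea - every name in it (ShaderPrecompiler, compile_fn, pso_list) is invented for illustration, not how Decima or Nixxes actually structure it:

```python
# Minimal sketch of menu-time background shader compilation with a
# progress readout. All names here are hypothetical, not engine code.
from concurrent.futures import ThreadPoolExecutor
import threading

class ShaderPrecompiler:
    def __init__(self, pso_list, compile_fn, workers=4):
        self._total = len(pso_list)
        self._done = 0
        self._lock = threading.Lock()
        self._pool = ThreadPoolExecutor(max_workers=workers)
        for pso in pso_list:
            self._pool.submit(self._compile_one, compile_fn, pso)

    def _compile_one(self, compile_fn, pso):
        compile_fn(pso)                 # the actual driver compile happens here
        with self._lock:
            self._done += 1

    def progress(self):
        """Fraction complete, for the '% compiled' readout on the main menu."""
        return self._done / self._total if self._total else 1.0

# The menu loop just polls progress(); the player can start playing at any
# time and the pool keeps churning through the remaining PSOs in background.
```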
 
* Bear in mind that 4K DLSS Performance, with a native input res of 1080p, is still quite a bit below 1800p CBR in native pixel count (2,073,600 vs 2,880,000 [1600x1800 CBR native]), and motion blur handling aside, jumping back and forth between the two there's no contest - DLSS is far superior at that res/setting. 1440p DLSS Performance, with a native res of 720p, has around a third of the native pixels of 1800p CBR, so no wonder. DLSS is very good, but it's not magic.
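For reference, the pixel counts above work out like this (plain arithmetic, nothing engine-specific):

```python
# Checking the pixel math quoted above.
dlss_4k_perf   = 1920 * 1080          # 4K output, DLSS Performance -> 1080p input = 2,073,600
cbr_1800p      = (3200 * 1800) // 2   # 1800p checkerboard shades half the pixels: 1600x1800 = 2,880,000
dlss_1440_perf = 1280 * 720           # 1440p output, DLSS Performance -> 720p input = 921,600

print(dlss_4k_perf / cbr_1800p)       # ~0.72 -> 4K DLSS Perf renders ~72% of 1800p CBR's pixels
print(dlss_1440_perf / cbr_1800p)     # ~0.32 -> 1440p DLSS Perf renders about a third
```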

To that end, here's a 3-way imgsli comparison between PC 4K DLSS and the PS5's Performance/Quality modes. Wish I could have lined up the PC and PS5 shots a little better, but you get the idea.

That site is pretty slow btw, so give it time to load the images.

Gotta say, after playing it a bit more, Nixxes are truly ridiculous with how far ahead they seem to be of other development teams in wrangling DX12. I've probably played for 3 hours now, and in that time I have yet to see the slightest frametime blip that wasn't either GPU resources being overextended or the deliberate pauses put in for cutscene camera cuts. Like literally - not. a. single. one. Hell, I'm still mostly GPU-limited even if I put it on 720p with DLSS Performance. This is on a 12400F, DDR4 setup - never dropping below 100fps, the CPU rarely cracks 50% overall, and the frametime graph is just amazingly consistent. I've probably seen the occasional spike more often in Doom Eternal, for Pete's sake.

Jumping back and forth between Windows and Linux recently has made me appreciate the problem of shader stuttering even more. Steam's Fossilize system, which distributes shared Vulkan pipeline caches, can make a huge difference in some games (hello Borderlands 3 with your 6GB of shader cache!), and enabling the DXVK HUD to show real-time shader-compile events during gameplay further reinforces how often those little blips on the frametime graph I've seen over the years can, in fact, be due to shader compilation. Yes, not all stutters are shader-related, and the games Nixxes ports already have well-streamlined engines for the platforms they were designed for, but all of that would be for naught if Nixxes didn't spend so much time crafting systems to handle the huge number of PSOs these games demand.
 
Open the floodgates of Aloy mods!
Any mod that greatly reduces Aloy's chattiness outside of cutscenes/conversations should be a priority. What's up with main characters in games being unable to keep their mouths closed for a few minutes?
 
Thanks for your excellent posts on the game. I'm writing this on my phone, so it's slow as heck, but I just wanted to point that out. My plan is to get the game some time in the future; Nixxes makes excellent ports, btw. Also, one of my preferred screens is fixed refresh rate, and while I think VRR is one of the best technologies of the last 20 years and should be universal, it isn't always an option.

What is the SX controller you talk about in one of the posts?
 
Great DF interview with Nixxes about their PC port of Horizon Forbidden West, covering CPU decompression, PSOs, early PC versions, and the involvement of Guerrilla and Nixxes in the project.

 
2) Medium detail, with perhaps screen space shadows added (they're disabled as part of the preset) + very high textures + high LOD (medium LOD is brutal), seems to hold 60fps at 1440p DLSS Performance output... well enough, but it's close - there are still a few drops below, and that's without even fighting anything. High is out of the question.

Overall, in terms of image quality vs the PS5, I'd say 1440p DLSS Performance (sharpening at the default 5) looks quite a bit softer than the PS5's checkerboarding in Performance mode, both static and in motion. There are elements where DLSS is perhaps better, but overall - nah. DLSS Performance at 4K output looks significantly better; not quite PS5's Quality mode, but it's probably closer to that than 1440p/Performance is to PS5's Performance mode.

3) The dynamic res is too fine-grained with respect to GPU occupancy - constant drops with it enabled, as it's trying too hard to keep the GPU busy. Only forcing DLSS Performance mode reduces these.
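For context, a DRS loop is essentially a feedback controller on GPU frame time; the closer it's tuned to the full frame budget (headroom factor near 1.0, i.e. "keep the GPU fully occupied"), the less slack there is, so any load spike becomes a dropped frame before the resolution can react. A toy sketch of that behaviour - every value and name here is invented, not the game's actual controller:

```python
# Toy dynamic-resolution controller: scales render resolution toward a GPU
# frame-time target. With headroom near 1.0 it sits right at the edge of the
# budget, so spikes drop frames before the scale can catch up.
def drs_step(scale, gpu_ms, target_ms=16.7, headroom=0.9,
             min_scale=0.5, max_scale=1.0):
    # Aim at a fraction of the budget; headroom=1.0 means "keep the GPU 100% busy".
    error = (target_ms * headroom) / max(gpu_ms, 1e-3)
    # Resolution scales with area, so take the square root for the per-axis scale.
    new_scale = scale * error ** 0.5
    return min(max_scale, max(min_scale, new_scale))

scale = 1.0
for gpu_ms in [14.0, 15.5, 18.9, 22.0, 16.0]:   # a spiky frame-time trace
    scale = drs_step(scale, gpu_ms)
    print(f"gpu={gpu_ms:5.1f} ms -> scale={scale:.2f}")
```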

Something is very wrong with performance there, which I can only assume to be DRS not performing as well on PC as on PS5, like you suggest - i.e. the performance lows on PC occur at a much smaller resolution delta from the norm.

1800p CB is over 3x the base resolution of 1440p DLSS Performance (so naturally image quality shouldn't be anywhere near close), so there is absolutely no reason a 3060 shouldn't obliterate PS5 performance at that setting.

EDIT: to add some additional context, if we were to scale resolution directly with performance, then based on TPU's GPU performance charts the PS5 would be performing somewhere between a 7900 XTX and a 4090 🤣
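Spelling out the back-of-envelope math behind that joke (no real TPU numbers in here, just the pixel ratio):

```python
# If performance scaled linearly with shaded pixels, matching PS5's 1800p CBR
# while a 3060 needs 720p would imply a GPU ~3x faster than a 3060 -- which on
# a relative-performance chart lands between a 7900 XTX and a 4090.
# Purely illustrative arithmetic; performance does not actually scale this way.
cbr_1800p = 1600 * 1800        # checkerboard-native pixels
dlss_720p = 1280 * 720         # 1440p DLSS Performance input
print(cbr_1800p / dlss_720p)   # ~3.1x implied performance gap
```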
 
Have you seen Dragon's Dogma 2? I don't think a "proper" port is necessary to sell your game...
The better a game runs, the more it'll sell, by and large. Otherwise every dev would throw out 20fps titles and not bother optimising, if there were no benefit from the optimisation effort.
 
Right now there are only 20k players; Dragon's Dogma 2 has 10x more. So I think doing just the minimum for a port is not enough after one year.

Just 9 years ago, Rise of the Tomb Raider was ported by Nixxes with VXAO. Six years ago, Nixxes implemented RT shadows in Shadow of the Tomb Raider. The pattern with Sony ports is obvious:
Does the PS5 version have raytracing support? Then the PC gets it too.
Does the PS5 version use UE4? Then it may get updated with raytracing.

None of the other PS game ports have been updated with raytracing (God of War, Uncharted, The Last of Us). It can't be that hard to implement standard effects like AO, shadows and reflections, which have been available since 2018/2019...
 

Didn't know that Alex doesn't use a 32:9 ultrawide screen. Still, you can easily achieve that with CRU by creating a custom resolution, like I did on my 2560x1440 monitor: I created a 2560x720 custom resolution and it works. Obviously on a 32" display the resulting viewport is small, but you can see the potential.

My TV has 32:9 support built in when using Game Mode with a PC, and on the 50" screen it looks really good. Even so, I created the custom 3840x1080 resolution with CRU for the TV, because in PC mode you can't use the Movie setting.
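The pattern behind these custom modes is simple: keep the panel's native width and derive the height from the 32:9 target, letterboxing the rest. A trivial sketch (the helper name is made up):

```python
# Deriving a 32:9 letterbox mode from a 16:9 panel: keep the native width,
# the height follows from the target aspect ratio.
from fractions import Fraction

def letterbox_height(native_w, aspect=Fraction(32, 9)):
    return int(native_w / aspect)

print(2560, letterbox_height(2560))   # 2560 720  -- the monitor mode above
print(3840, letterbox_height(3840))   # 3840 1080 -- the TV mode above
```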

Additionally, this quote from the interview is very interesting - it highlights how manageable the problem is on consoles or on the Steam Deck, but not on a regular PC.

Are you happy with that PSO collection process for Forbidden West on PC - could automation be used as well? And how do you feel about the whole PSO issue for Windows gaming in general?

Michiel Roza - Nixxes: We do have an automated collection, which doesn't catch everything, but does make sure the game is in a playable state and QA won't run into a stutter fest every time we change the shaders... I think PSOs are not a great solution. It's the best we have, but I think something else can be done here. I really like the way Valve is handling it on the Steam Deck... and I wonder if that can be extended to PC as well.

Jeroen Krebbers - Guerrilla: Yeah, it would need a lot of collaboration between Microsoft, hardware manufacturers and software developers to come to an understanding of how this should work. I've jokingly said like "why do we have GPUs that are so fast and yet we can't compile a shader on it?"
 
The PC version uses SSAO, not even HBAO+. Even Days Gone had considerably upgraded ambient occlusion.
 
I don't think Nixxes can afford to do many PC-specific enhancements; they have their hands full right now porting most Sony exclusives. After Spider-Man Remastered they went straight into Miles Morales, then Ratchet, then Horizon, then Ghost of Tsushima, and then they have Spider-Man 2 on their hands - and who knows whether The Last of Us Part 2 or God of War Ragnarok are theirs or not?

They are most definitely time-constrained.
 
The game has a mostly good RTGI implementation. There are image quality problems on Series X in resolving the checkerboarded output. Resolution is hard to determine; fps is between 30 and 40. Series X wins in GPU-limited scenarios, while PS5 wins in CPU-limited scenarios.

 
It'd be nice if the DLSS vegetation blurring and the promised-but-not-present RT reflections get sorted in a later patch. RT shadows would be nice too. I noticed tree shadow pop-in almost immediately and it's a bit distracting.
 

So curious what is happening with the CPU code. Are they tracking everything? Millions of constant collision-detection checks? Tons of rays being cast for some visibility reason?
Would really love to know.
 
Sadly, none of that - it's NPC logic and scheduling. That's why NPCs fade in and out so close to the player: an effort to eliminate their CPU burden.
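For illustration, that kind of cost is typically kept in check with distance-bucketed update scheduling: nearby NPCs tick every frame, distant ones every Nth frame, and beyond a radius they're despawned entirely, which is exactly what the aggressive fade in/out looks like from the outside. A generic sketch of the pattern - all names and thresholds are invented, not the game's actual code:

```python
# Generic distance-bucketed NPC scheduler: nearby NPCs tick every frame,
# distant ones every Nth frame, and beyond a radius they're despawned (the
# visible fade in/out). Illustrative only.
import math

def npc_tick_interval(dist):
    if dist < 20:   return 1      # full-rate simulation near the player
    if dist < 60:   return 4      # quarter-rate logic in the mid ring
    if dist < 100:  return 16     # barely-alive far ring
    return None                   # despawn: zero CPU cost, fade out

def update_npcs(npcs, player_pos, frame):
    for npc in npcs:
        dist = math.dist(npc["pos"], player_pos)
        interval = npc_tick_interval(dist)
        if interval is None:
            npc["active"] = False      # triggers the fade-out
        elif frame % interval == npc["id"] % interval:
            npc["active"] = True       # staggered so ticks spread across frames
            # ... run pathfinding / daily schedule / collision for this NPC ...
```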
 