Digital Foundry Article Technical Discussion [2024]

Right now there are only 20k players, while Dragon's Dogma 2 has 10x more. So I think doing just the minimum for a port isn't enough after one year.

Just nine years ago, Nixxes ported Rise of the Tomb Raider with VXAO. Six years ago, Nixxes implemented RT shadows in Shadow of the Tomb Raider. The pattern with Sony ports is obvious:
Does the PS5 version have ray tracing support? Then the PC gets it too.
Does the PS5 version use UE4? Then it might get updated with ray tracing.

Every other PlayStation port has not been updated with ray tracing (God of War, Uncharted, The Last of Us). It can't be that hard to implement standard features like ray traced AO, shadows and reflections, which have been available since 2018/2019...

I really don't think games like this fail to sell gangbusters on PC because they lack VXAO or RT. They don't sell in huge numbers at launch because they're 2-3 year old games that have been heavily discounted on console in the gap between the original release and the PC release, which then ships at full price. This is a stunning-looking game even with 'just' SSAO; the rendering tech is not the sales bottleneck here.
 
So curious what is happening with the CPU code. Are they tracking everything? Millions of constant collision detection checks? Tons of rays being cast for some visibility reason?
Would really love to know.
I'd wager money on “lots of raycasts” and “designer-driven callback/event hell” — the AI behavior is very cool, very expressive, very reactive, and very unlikely to be architected well.
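To give a sense of the scale being gestured at, here's a purely hypothetical back-of-the-envelope sketch (nothing to do with Capcom's actual code) of how per-frame visibility raycasts multiply out once every pawn/NPC polls several targets per AI tick:

```python
# Hypothetical numbers only: per-frame line-of-sight raycasts if every NPC
# polls a handful of targets each AI tick.
def raycasts_per_second(npcs: int, targets_per_npc: int, tick_hz: int) -> int:
    """Total visibility raycasts issued per second under this assumed scheme."""
    return npcs * targets_per_npc * tick_hz

# e.g. 200 pawns/NPCs in a town, each checking 8 targets at 30 Hz:
print(raycasts_per_second(200, 8, 30))  # 48,000 raycasts/s, before any other physics queries
```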
 
Forbidden West is also a story sequel to Zero Dawn, meaning you kinda need to have played the first to understand what's going on in the second. And with PC as a secondary market for PlayStation games, a whole lot fewer PC players will have played Zero Dawn than on PlayStation. Even if Zero Dawn sold well on PC, we also know that PC gamers have a greater tendency to buy games and then not play them. :p

I also just don't think Forbidden West is quite as revered as Zero Dawn. And Zero Dawn was one of the earliest PS ports, so people were just more excited for it. PS games coming to PC nowadays aren't quite as notable/special.

All combined, I can see why it's not topping charts at the moment. And it will once again demonstrate that port quality/performance or whatever are not what dictate sales. The industry is simply not a meritocracy. I wouldn't be surprised if God of War Ragnarok has a similar issue when it inevitably comes to PC.
 
This game doesn't run optimally on Nvidia GPUs: https://www.computerbase.de/2024-03...benchmarks_in_full_hd_mit_reduzierten_details

I wouldn't use these PS first-party games as a baseline for how well different architectures perform. There is no reason for Sony to optimize their engines for Nvidia GPUs.

However, on both the Radeon RX 7900 XTX and the GeForce RTX 4080 Super there are constant, small fluctuations in frame pacing. If the frame rate is high enough these are not a problem, but if the FPS falls below the 60 FPS mark they could potentially have a negative impact on the feel of the game.

Perhaps what I'm seeing is measurable after all.
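It should be measurable; a frame-time capture (the per-frame times from a CapFrameX/PresentMon-style CSV) reduces to a simple pacing summary. A rough sketch, assuming nothing more than a list of frame times in milliseconds:

```python
from statistics import mean, pstdev

def pacing_stats(frame_times_ms):
    """Summarise frame pacing from a list of per-frame times (ms)."""
    avg = mean(frame_times_ms)
    jitter = pstdev(frame_times_ms)                           # overall variability
    spikes = sum(1 for t in frame_times_ms if t > 1.5 * avg)  # frames 50%+ over average
    return {"avg_ms": round(avg, 2), "stdev_ms": round(jitter, 2), "spike_frames": spikes}

# Made-up example: mostly ~16.7 ms frames with small constant fluctuations
print(pacing_stats([16.7, 15.9, 17.4, 16.1, 17.6, 16.5, 19.2, 16.4]))
```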

I also just don't think Forbidden West is quite as revered as Zero Dawn.

It's not. I thought it was...fine, but definitely not up to the story of the first one, and it made some very odd gameplay changes, or in some ways not enough change.
 
Grabbed it to test on my 3060/12400F, and impressions are...decidedly mixed atm.

1) "Waiting for shader compilation" on the first 5+ loads of the game, regardless of whether it was just resuming to the same area I was slowly walking around for 30 minutes prior. So getting back into the game was a ~2 minute endeavor for a good handful of reloads.

(Note though, I don't think this is related to the performance I detail below; it doesn't appear to be a CPU limitation - the performance drops are coming from the GPU.)

2) Medium detail, with perhaps screen space shadows added (disabled as part of the preset) + very high textures + LOD high (LOD medium is brutal), seems to hold 60fps at DLSS Performance with 1440p as the output resolution...well enough, but it's close - there are a few drops below still, and this isn't even fighting anything. High is out of the question.

Overall in terms of image quality vs the PS5, I'd say DLSS 1440p performance (sharpening set at default 5) looks quite a bit softer compared to PS5 checkerboarding quality in performance mode, both static and in motion. There are elements where DLSS is better perhaps, but overall - nah. DLSS performance at 4k output looks significantly better, not quite PS5's Quality mode but it's prob closer to that than 1440p/performance is to PS5's performance mode.

3) The dynamic res is too fine-grained with respect to GPU occupancy; there are constant drops with it enabled as it tries too hard to keep the GPU busy. Only forcing DLSS Performance mode can reduce these.
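For context on why an over-eager DRS behaves like that, here's a minimal toy model of a proportional resolution controller (my own sketch, not Guerrilla's actual heuristic): if it chases the frame-time target too tightly, a sudden jump in scene cost lands as a dropped frame before the scale can react.

```python
def update_scale(scale, gpu_frame_ms, target_ms=16.6, gain=0.5,
                 min_scale=0.5, max_scale=1.0):
    """Toy dynamic-resolution controller: nudge the render scale toward the
    frame-time target. A high gain keeps the GPU near 100% busy but overshoots
    whenever the scene cost jumps between frames."""
    error = (target_ms - gpu_frame_ms) / target_ms   # positive = headroom left
    scale += gain * error * scale                    # proportional step
    return max(min_scale, min(max_scale, scale))

scale = 1.0
for cost_at_full_res in [15.0, 15.5, 22.0, 21.0, 16.0]:  # ms if rendered at 100% scale
    gpu_ms = cost_at_full_res * scale                     # crude linear cost model
    scale = update_scale(scale, gpu_ms)
    print(f"gpu {gpu_ms:5.1f} ms -> new render scale {scale:.2f}")
```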

4) Even with Nvidia Reflex + boost, it seems to have worse controller latency than the PS5 (playing with a wired XSX controller). HZD also had this issue. Granted, even on PS5, HFW's default controls always feel 'floaty', so perhaps different controllers could explain some of it; the response curve between them usually differs IME*. Without Reflex it's of course worse.

*Nah, tested with a wired Dualsense. The latency sucks.

5) With motion blur enabled and fast movement, there is definitely noticeable shimmering with DLSS and vegetation at 1440p. It's less noticeable at 4K output as you would expect, but it's still there. It's not just restricted to motion blur; there is generally some level of shimmering, to varying degrees depending on the type of vegetation (stripped trees are especially problematic), when you're using DLSS Performance, but motion blur makes it worse.

This is also the kind of thing you won't necessarily pick up on when you're comparing reconstruction methods with very slow pans or walking about. I want to see if artifacting stands out with fast movement, and occasionally it can.

6) The cardinal sin - microstutter. 60fps is not an entirely properly frame-paced 60fps, or rather the camera animation isn't properly paced. Slowly pan the camera around and you'll get occasional small camera skips and hitches at points. Nvidia Reflex makes it far worse, but even without it they can crop up (and input latency is pretty brutal without it). RivaTuner cannot fix this either. Yikes.

This is not something you'll likely pick up on unless you're playing with a controller on a fixed refresh rate display, which is probably why Death Stranding's controller stutter was missed when that PC port dropped too. Like DS, there don't appear to be frame time jumps when these happen; it's a camera animation issue. But it's there.
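That distinction (flat frame times, uneven camera motion) is checkable in principle: log camera yaw per presented frame and compare the per-frame step against the frame time. A sketch with made-up numbers, just to show what "no frame-time jumps but lumpy camera animation" looks like:

```python
# Hypothetical capture: constant 16.7 ms frames, camera panning at a nominally
# constant rate, but the per-frame camera step occasionally stalls and then doubles.
frame_ms = [16.7] * 8
yaw_deg  = [0.0, 0.50, 1.00, 1.50, 1.52, 2.48, 3.00, 3.50]  # cumulative yaw per frame

steps = [b - a for a, b in zip(yaw_deg, yaw_deg[1:])]
for ms, step in zip(frame_ms[1:], steps):
    flag = "  <-- hitch in camera animation, not in frame time" if abs(step - 0.5) > 0.2 else ""
    print(f"{ms:.1f} ms  yaw step {step:+.2f} deg{flag}")
```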

So some patches are in order, as usual it seems. Yes, it's a low-end system by today's standards and I wasn't expecting performance parity using a 3060; I expected the requisite 'tax' for modern PS5 ports, but atm there's a bigger quality gap than I anticipated. Not as problematic as Spider-Man was at launch, but that microstutter and the lack of attention given to motion blur and DLSS (especially as it's enabled by default) are pretty big misses imo.

I'm particularly skeptical the microstutter issue will get fixed, as it's very subtle and I think many players, especially m&k players on VRR displays, will miss it.

I'm playing the game with a controller on a Ryzen 7700x and 4090 with a 144hz VRR screen and the microstutter is very noticeable to me, it's not just because you're on a fixed refresh display. It really doesn't feel especially smooth. The latency is ridiculous too, I can hit a basically locked 120 (technically 116 because of reflex) with DLAA + frame gen but the input lag is so bad I couldn't play it like that. Not sure what's up there. I tried it briefly on my old 2080 Ti machine as well and while framerates were mostly in the 55-60 range playing at a mix of medium and high settings at 4k with DLSS + dynamic resolution targeting 60 it didn't feel at all smooth to play, there is a lot of microstutter.

I also completely agree with your analysis of DLSS in this game, something is definitely off there. Maybe it's the nature of the game's visuals, it's a very busy looking game, but anytime you move the camera the image becomes quite grainy and aliased even outputting at 4k. I tried switching to preset F using DLSStweaks and that seemed to help slightly compared to preset C but that could easily be placebo.

The game has all of the components of a great port but these issues are enough that I'm going to wait for a few patches to really get into it.
 
I'm playing the game with a controller on a Ryzen 7700x and 4090 with a 144hz VRR screen and the microstutter is very noticeable to me, it's not just because you're on a fixed refresh display. It really doesn't feel especially smooth. The latency is ridiculous too, I can hit a basically locked 120 (technically 116 because of reflex) with DLAA + frame gen but the input lag is so bad I couldn't play it like that. Not sure what's up there. I tried it briefly on my old 2080 Ti machine as well and while framerates were mostly in the 55-60 range playing at a mix of medium and high settings at 4k with DLSS + dynamic resolution targeting 60 it didn't feel at all smooth to play, there is a lot of microstutter.

Yeah it's definitely noticeable when I jump back into the PS5 version. My fps is 'locked' at 60 but something is...off.

I also completely agree with your analysis of DLSS in this game, something is definitely off there. Maybe it's the nature of the game's visuals, it's a very busy looking game, but anytime you move the camera the image becomes quite grainy and aliased even outputting at 4k. I tried switching to preset F using DLSStweaks and that seemed to help slightly compared to preset C but that could easily be placebo.

Yeah it's a little weird, there are some very grainy effects. In some ways it's almost like a cross between how the PS5 was rendering before their big patch, which added a lot more temporal stability, and after the patch, like the PC is some middle ground - but there's definitely some kind of oversharpened aspect to the artwork/rendering that is still there on the PS5 too. Some effects on the PS5 are much blurrier (like the waterfall mists) but they have less instability.

Something is very wrong with performance there, which I can only assume to be DRS not performing as well on PC as on PS5 like you suggest, i.e. the performance lows on PC happen at a much smaller resolution drop from the norm.

1800p CB is over 3x the base resolution of 1440p DLSS performance (so naturally image quality shouldn't be anywhere near close) so there is absolutely no reason a 3060 shouldn't obliterate PS5 performance at that setting.

EDIT: to add some additional context there, if we were to scale resolution directly with performance, then based on TPU's GPU performance charts, the PS5 would be performing between a 7900XTX and a 4090 🤣
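Quick sanity check on the base-pixel maths (assuming DLSS Performance renders at 50% per axis and 1800p checkerboarding renders half of a 3200x1800 frame):

```python
def pixels(w, h):
    return w * h

cb_1800p      = pixels(3200, 1800) // 2        # 1600x1800 base = 2,880,000 pixels
dlss_perf_qhd = pixels(2560 // 2, 1440 // 2)   # 1280x720 internal = 921,600 pixels
native_1080p  = pixels(1920, 1080)             # 2,073,600 pixels

print(round(cb_1800p / dlss_perf_qhd, 2))  # ~3.13x -> "over 3x the base resolution"
print(round(cb_1800p / native_1080p, 2))   # ~1.39x -> the "39% more than 1080p" figure used later
```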

Bear in mind the cost of DLSS here too, though afaik checkerboarding also has a relatively high cost. From the recommended spec charts nothing really seems to be off with my system in particular though; as Alex discussed in his preview video, those "60 fps" recommendations from that chart were also assuming the use of dynamic res in order to reach that (!). So Guerrilla's recommended spec for a 3060 was 1080p medium.

Without using dynamic res, here's what 1080p medium gets me in this area:

[screenshot attachment]

1440p Medium:

[screenshot attachment]

4k, DLSS Performance Medium:

[screenshot attachment]


The cost of DLSS in this game seems a bit high, but not really that out of the ordinary - 4K DLSS Performance matches up well to straight 1440p in most of my tests with other games; here it's a little more. The performance cost of DLSS is definitely something I'd like to see improved in the next architecture.

We'd have to know what exact res the PS5 is rendering at in performance mode for this same area, as it's always dynamic, but if it's anything remotely close to the 1600x1800 native rendering res of 1800p CBR, then I'm definitely in TLOU rendering-deficit territory (especially when you consider the PS5 is likely using higher than medium settings, and adding in CBR's performance cost!). 😬
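One way to put a number on the DLSS cost from captures like the ones above: 4K Performance mode renders internally at 1920x1080, so the frame-time gap versus plain native 1080p at the same settings is roughly the upscale overhead. A sketch with placeholder fps values (not the actual screenshot results):

```python
def dlss_overhead_ms(fps_native_1080p, fps_4k_dlss_perf):
    """Approximate DLSS upscale cost: both runs render ~1920x1080 internally, so
    the frame-time difference is mostly the reconstruction pass (plus any
    post-processing the game runs at output res)."""
    return 1000 / fps_4k_dlss_perf - 1000 / fps_native_1080p

# Placeholder numbers for illustration only:
print(round(dlss_overhead_ms(80, 66), 2), "ms per frame")  # ~2.65 ms in this example
```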
 
The game has a mostly good RTGI implementation. There are image quality problems on Series X in resolving the checkerboarded output. Resolution is hard to determine, and fps is between 30 and 40; Series X wins in GPU-limited scenarios, while PS5 wins in CPU-limited scenarios.

So watching the video, I don't see why on PS5/Series X it couldn't run at least at an unstable 60 fps outside of cities, by reducing the res to a straight 1080p without reconstruction (from the video that reduces CPU load significantly), removing ray tracing and maybe using some slightly lower LOD settings.

They could call it a "performance mode" :yes:

The city is unsolvable without developer intervention, but it's just one section of the game.
 
We'd have to know what exact res the PS5 is rendering at in performance mode for this same area, as it's always dynamic, but if it's anything remotely close to the 1600x1800 native rendering res of 1800p CBR, then I'm definitely in TLOU rendering-deficit territory (especially when you consider the PS5 is likely using higher than medium settings, and adding in CBR's performance cost!). 😬

The PS5's 1800p CB res is only 39% more base pixels than straight up 1080p so if your 3060 can maintain a solid 60 without DRS at 1080p then that's far more reasonable IMO.

I'd put the PS5 as nominally around 15-20% faster than the 3060, so a 39% resolution deficit + removal of the CB overhead and potentially lower settings should make easy work for the 3060 under that scenario. The big unknown though, which can completely change the picture, is what resolution the PS5 is using at the 3060's stress points. Without that info I fear that any performance comparisons are largely fruitless.

For reference a drop to 1500p CB would be fewer base pixels than 1080p. Do we know what resolution range the PS5 uses?
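The 1500p claim checks out under the same assumptions (16:9 output, checkerboarding renders half the output pixels):

```python
cb_1500p     = (2667 * 1500) // 2   # ~2.00M base pixels at 1500p checkerboard
native_1080p = 1920 * 1080          # ~2.07M pixels
print(cb_1500p, native_1080p, cb_1500p < native_1080p)  # True: 1500p CB is already below native 1080p
```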
 
The PS5's 1800p CB res is only 39% more base pixels than straight up 1080p so if your 3060 can maintain a solid 60 without DRS at 1080p then that's far more reasonable IMO

Whoops, wasn't thinking clearly - I think I still had DLSS ratios in my head when making that comparison to the PS5's native CBR res.

I'd put the PS5 as nominally around 15-20% faster than the 3060, so a 39% resolution deficit + removal of the CB overhead and potentially lower settings should make easy work for the 3060 under that scenario. The big unknown though, which can completely change the picture, is what resolution the PS5 is using at the 3060's stress points. Without that info I fear that any performance comparisons are largely fruitless.

For reference a drop to 1500p CB would be fewer base pixels than 1080p. Do we know what resolution range the PS5 uses?

Neither DF video on it, the original or the one addressing the patch (which apparently did increase perf so res doesn't drop as much), gives any hard numbers, just 1800p max.
 
Whoops, wasn't thinking clearly - I think I still had DLSS ratios in my head when making that comparison to the PS5's native CBR res.



Neither DF video on it, the original or the one addressing the patch (which apparently did increase perf so res doesn't drop as much), gives any hard numbers, just 1800p max.
It drops res in CBR mode quite often; that's why 1440p with DLSS Performance can look at all similar, because the PS5 can be around that res with CBR on pretty often during gameplay.

If you wanna see how it can dip resolution in that mode on PS5, just watch the opening cutscene, there are a lot of wide shots there which are sub 1800p CBR.

Edit: another thing to consider is the cost of post-processing. HZD did its post-processing at the pre-checkerboard-resolve resolution, and HFW might be the same. DLSS guidance is to do it at output res.
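If post-processing cost scales roughly with pixel count, that difference matters: doing it at output res (per the DLSS guidance) versus at the pre-checkerboard internal res is a few-times gap. A rough illustration with an assumed, made-up per-megapixel cost:

```python
COST_MS_PER_MPIX = 0.8  # assumed cost of the post-processing chain per megapixel

def post_cost_ms(w, h):
    return COST_MS_PER_MPIX * (w * h) / 1e6

print(round(post_cost_ms(1600, 1800), 2))  # at an 1800p-CB-style internal res: ~2.3 ms
print(round(post_cost_ms(3840, 2160), 2))  # at 4K output res: ~6.6 ms
```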
 
The game has a mostly good RTGI implementation. There are image quality problems on Series X in resolving the checkerboarded output. Resolution is hard to determine, and fps is between 30 and 40; Series X wins in GPU-limited scenarios, while PS5 wins in CPU-limited scenarios.


Is the RTGI probe based? May have missed it but I didn’t hear any technical details on the implementation in the DF video.

I’m far less impressed with the visuals in DD2 than the DF crew seems to be. The GI is ok but nearly everything else including character, monster and environment models look like minor improvements over the first game.
 
I'm playing Forbidden West at native 3440x1440 with max settings and no AA.

You soon get used to the jaggies but the game is crispy AF!

I use RTSS to cap to 60fps and I've not felt that the input lag or response is bad.
 
Is the RTGI probe based? May have missed it but I didn’t hear any technical details on the implementation in the DF video.
Capcom detailed their GI methodology in the presentation here; they also introduced some form of light importance sampling to their GI.


On another note, it looks like the DLSS 3 files are there in Dragon's Dogma 2, but Capcom has hidden the feature temporarily as it caused unspecified behavior; users can enable it easily using a simple mod.

 

Looking at this video of a PC with a 4060 and a 3600, the game runs above 60 fps at 1080p without ray tracing when outside camps or cities. Of course the frame rate drops when he enters the camp, but I think this is a compromise most users would accept. The PS5 offers performance in line with the 3600 in this game, so the developers should offer something similar for users who want higher frame rates.

PS: all this applies to Series X too, just a little more unstable.
 

Looking at this video of a PC with a 4060 and a 3600, the game runs above 60 fps at 1080p without ray tracing when outside camps or cities. Of course the frame rate drops when he enters the camp, but I think this is a compromise most users would accept. The PS5 offers performance in line with the 3600 in this game, so the developers should offer something similar for users who want higher frame rates.

PS: all this applies to Series X too, just a little more unstable.

The desktop 3600 can be 40-70% faster than the CPU in the consoles due to its extra L3 cache (tested by Digital Foundry).

So the consoles will never have the same frame rate experience as a PC with a 3600.
 
The desktop 3600 can be 40-70% faster than the CPU in the consoles due to its extra L3 cache (tested by Digital Foundry).

So the consoles will never have the same frame rate experience as a PC with a 3600.
The performance section of the Digital Foundry video shows the 3600 being 10% faster than the Series X, so a PS5 being 3-5 fps faster than the Series X puts it at about 3600 performance. Of course the settings used aren't exactly the same, but both are 4K interlaced and both have RTGI enabled, so they are very similar.

Also I don't know where you got that a 3600 is 40 to 70% faster than the console, I would like a source 🙄
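For what it's worth, the "3-5 fps faster ≈ 3600-level" reading does roughly work out if the baseline sits in the low-to-mid 30s, as in the 30-40 fps range quoted above. A quick check under that assumption:

```python
series_x_fps = 34                # assumed baseline within the quoted 30-40 fps range
r3600_fps = series_x_fps * 1.10  # "the 3600 is ~10% faster than Series X" -> ~37.4 fps

for ps5_delta in (3, 5):
    ps5_fps = series_x_fps + ps5_delta
    print(f"PS5 at +{ps5_delta} fps = {ps5_fps} fps vs 3600 at ~{r3600_fps:.1f} fps")
```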
 
The performance section of the Digital Foundry video shows the 3600 being 10% faster than the Series X, so a PS5 being 3-5 fps faster than the Series X puts it at about 3600 performance. Of course the settings used aren't exactly the same, but both are 4K interlaced and both have RTGI enabled, so they are very similar.

It won't be at 3600 performance; the 3600 is just a faster CPU, and no amount of hopes or dreams will change that.

Also I don't know where you got that a 3600 is 40 to 70% faster than the console, I would like a source 🙄

You were given the name of the source, but as you're struggling, let me throw you a bone.

The AMD 4800S is the console CPU (same clocks and same amount of L3 cache)

But even with two extra cores it gets hammered by the Ryzen 5 3600.

The 3600 is 47% faster in CP2077

[benchmark chart attachment]

And a whopping 73% faster than the console CPU in Metro Exodus.

[benchmark chart attachment]

The console CPUs have a pitiful amount of L3 cache and it kills their performance; they're no better than a Ryzen 1800X.
 
I thought this forum was above shit talking, but anyways...

Can you play actual CPU-limited titles with a 4700S at PS5 performance?
Let's take Warzone 2, or Apex Legends, or even Fortnite.
All have 120 fps modes on PS5, and they are all 99% locked to the target.


Even a 3600 fails to get close to 120 fps in Warzone 2, and this is with a 4070 Super, so we can be sure it's a CPU bottleneck; you can imagine how an 1800X would fare here. I had an 1800X years ago, and it was really good, it just didn't perform like a PS5.

We also have the actual video from Digital Foundry, which I don't think you have watched, comparing the 3600 to the Series X.
It's at 17:02, so you don't have a hard time searching for it.

About the 4700S video: it's an academic exercise that isn't representative of the real-world performance of that CPU, especially since it's a PC environment vs a console environment.
 

Can we have a video showing matched settings and not from some random YouTube channel?
 