Digital Foundry Article Technical Discussion [2023]

Outside of some small 1fps drops in towns on the PS5, the game is pretty much locked at 60fps on consoles from all reports I've seen. Due to the isometric design and the quality setting used (likely Balanced here), FSR2 gets very close to native 4K in final appearance. Most console players will get a very solid-performing, 4K-like experience. The vast majority would have no idea what the base rendering resolution was, as that's the entire point of reconstruction.

Amazing? No, but not remotely comparable to the state of many console->PC ports. God I wish most PC ports were this 'poor'.
Am I supposed to be impressed that this average looking game runs at 60 fps when better looking games run at higher resolutions at the same 60 fps or better?
No, it can't. The 1% lows on the 3050 from TomsHardware's charts are just over 50fps; that would give you a hell of a lot more frame drops than we're seeing from the console versions. Hell, even 0.1% lows of 50fps would be less consistent than what I've seen on consoles, never mind 1% - remember that's an average of the 1% lows. It means it could be dropping well below 50fps at points too.
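(For reference, since the metric trips people up: the '1% low' is usually computed as the average of the slowest 1% of frames, expressed as fps - exact methodology varies by outlet, but it's along these lines:)

```python
# '1% low' as commonly computed: average the slowest 1% of frames
# (the largest frametimes) and express the result as fps. Because
# it's an average, individual frames inside that 1% can be far slower.
def one_percent_low(frametimes_ms):
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)   # slowest 1% of frames
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms          # back to fps

print(one_percent_low([16.7] * 990 + [30.0] * 10))  # ~33.3 fps
```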

To be consistently over 60fps for the 1% lows at 1440p, you need a 2070/3060ti. Yes, the 3060ti would be considerably over 60fps - but as has been explained numerous times, reconstruction has a cost. FSR2 starting from ~1300p will very likely be more costly than native 1440p, probably significantly so.*

All this says is that Blizzard prioritized a stable framerate, as they should - especially when FSR2 works so well here. In actual comparable performance to the PC, you're getting something in between a 3060 and a 3060ti. That is not exceptionally poorly optimized at all; it's nothing like, say, needing a 3080ti to equal PS5 performance in TLOU.

*Edit: Here, check this out. Just did this test:

God of War, mostly console settings, 4k with FSR Performance (so native 1080p): 67fps

Native 1440p: 71fps

1080p with FSR2 is more costly than native 1440p. If you wanted higher res than 1440p with simple bilinear upscaling, you very likely could have gotten it, probably at least 1620p or higher. Blizzard just felt FSR2 was the right choice here, and I'd say they look to be correct (and if they had just used regular scaling, you would hear countless complaints about how they didn't 'even bother' with reconstruction, just as Naughty Dog has received).
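Putting numbers on that: 1080p is only ~56% of 1440p's pixel count, yet the FSR2 path still ran slower. Converting my results above to frame times makes the overhead explicit:

```python
# Convert the God of War results above into per-frame cost:
# 1080p internal + FSR2-to-4K vs native 1440p.
fsr2_fps, native1440_fps = 67, 71

fsr2_ms = 1000 / fsr2_fps          # ~14.9 ms/frame
native_ms = 1000 / native1440_fps  # ~14.1 ms/frame
print(f"FSR2 Performance (1080p->4K): {fsr2_ms:.1f} ms")
print(f"Native 1440p:                 {native_ms:.1f} ms")
print(f"delta: {fsr2_ms - native_ms:.1f} ms/frame")  # ~0.8 ms
```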
Yea but the console version is using FSR 2? If I wanted a more stable framerate on the 3050, I'd just turn on DLSS2 and get a similar outcome to consoles? Issue solved. Again, it's good that Blizzard made sure the framerate was a semi-locked 60, but the game is so visually unimpressive that more is expected. I guess that's where we disagree. Some people believe that the game looks good, but I'm not one of those people. Looking at the environments, the highly controlled camera, etc., it's a shame this is the best they could do....
 
Its minimum CPU requirement on PC is the now 12-year-old i5 2500K, so I suspect it's not that demanding.
I think it may actually be a bit more CPU-heavy than we think; I saw a video of a 3600X having trouble keeping a 2070 at full load at 1080p. We're talking GPU utilization sitting between 50-70%. I'll try to find the video.
 
Another 'oddity' with D4's RAM use is that it uses virtual textures for some of the scenery. That should be a low/fixed memory footprint. Maybe everything that's not static is a memory hog.
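That's why fixed use is the expectation, for what it's worth - a virtual texture system streams pages into a fixed-size cache, so the resident footprint is the cache, not the scene. Back-of-the-envelope with made-up page/cache numbers (D4's actual values aren't public):

```python
# Why virtual texturing should give a roughly fixed footprint.
# Page size and cache count here are illustrative, not D4's values.
page_texels = 128 * 128          # texels per virtual-texture page
bytes_per_texel = 1              # e.g. BC7-compressed, ~1 byte/texel
pages_in_cache = 4096            # resident page budget

cache_bytes = page_texels * bytes_per_texel * pages_in_cache
print(f"resident VT cache: {cache_bytes / 2**20:.0f} MiB")  # ~64 MiB
```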

Not that I'm bothered. The Deck is great for this.
 
I think it may actually be a bit more CPU-heavy than we think; I saw a video of a 3600X having trouble keeping a 2070 at full load at 1080p. We're talking GPU utilization sitting between 50-70%. I'll try to find the video.

"Diablo 4 does not require a high-end CPU for gaming at over 100fps. By simulating only two cores (with SMT enabled), we were able to run the game with a minimum of 150fps at 1080p/Ultra Settings. So, if you own an old CPU, you’ll be completely fine and you’ll be able to run it."

Link
 
"Diablo 4 does not require a high-end CPU for gaming at over 100fps. By simulating only two cores (with SMT enabled), we were able to run the game with a minimum of 150fps at 1080p/Ultra Settings. So, if you own an old CPU, you’ll be completely fine and you’ll be able to run it."

Link
Well, you're right and I'm wrong. The video I saw had it locked at 75fps, and while it was dropping into the 60s and load wasn't going above 70%, it was also from the beta two months ago, so I'm gonna take the L on this one.

I will say though that using a 7950X3D and disabling some cores/CCDs isn't a really great way to emulate a lower-end CPU, because you're still going to have the extra cache, frequency, and power budget a real lower-end CPU won't have.
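If anyone wants to approximate the 'two cores' test themselves, pinning the game process to a couple of cores is the usual approach - same caveat applies, you're not changing cache or clocks. A minimal sketch with psutil; the process name is just an example:

```python
# Pin a running process to two logical CPUs to roughly approximate
# a low-core-count CPU. Caveat from above still applies: cache size
# and clocks don't change, only the core count the game can use.
import psutil

TARGET = "Diablo IV.exe"  # example process name, adjust as needed

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        proc.cpu_affinity([0, 1])  # restrict to the first two CPUs
        print(f"pinned PID {proc.pid} to cores 0-1")
```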
 
The current state of troubled PC ports.

The Callisto Protocol: remains bad, stuttering is fixed, CPU performance is improved by 15%, but the single-threaded nature of the game remains the same.

Dead Space Remake: still bad, VRAM issues are fixed, but shader stutters and traversal stutters remain horrendous.

Forspoken: good now, VRAM issues are fixed, CPU performance improved by 30%, and SSAO issues are fixed.

 
Am I supposed to be impressed that this average looking game runs at 60 fps when better looking games run at higher resolutions at the same 60 fps or better?

Fine, but I'm not addressing what you're 'impressed' by or not, that's entirely subjective. I'm just addressing your outrage at the supposed 'low' native resolution on consoles, as it's based on flawed reasoning. It's performing very similarly to how most other games perform on equivalent PC hardware. You can feel it's not technically brilliant, fine - but you're making direct comparisons to the PC version to imply the porting process on consoles is fundamentally flawed. It doesn't appear to be.

Yea but the console version is using FSR 2? If I wanted a more stable framerate on the 3050, I'd just turn on DLSS2 and get a similar outcome to consoles? Issue solved.

Yes, the consoles are using FSR2*. The point was that the 3050 is not a console equivalent. It's running at a non-reconstructed resolution with a lower (likely significantly lower) rendering cost than FSR2 from ~1300p, and it's still dropping 1% lows into the 50s, which means it's dropping even lower than that. If you enabled FSR2 on a 3050 in, say, performance mode - so 1080p native and 4K output - it could actually end up running slower than native 1440p. So no, you would not get a 'similar outcome'. You'd get a worse-performing and worse-looking version - not to mention potentially awful texture stuttering.

Based on the goal of keeping those 1% lows as close to 60fps as possible, and a rendering load equivalent to ~1300p FSR2, you're going to need something in between a 3060 and 3060ti. Which basically covers the equivalent performance profile of the vast majority of console/PC games. It just isn't an outlier like you suggest.

(*If you're implying that DLSS2 has a significantly smaller performance penalty than FSR2, that is also not really the case. It can be smaller in some games, but we're talking single-digit percentages. Irrelevant either way, as console devs have to make use of the technologies that are available to consoles, and DLSS isn't one of them.)
 
"Diablo 4 does not require a high-end CPU for gaming at over 100fps. By simulating only two cores (with SMT enabled), we were able to run the game with a minimum of 150fps at 1080p/Ultra Settings. So, if you own an old CPU, you’ll be completely fine and you’ll be able to run it."

Link
"
Diablo 4 does not feature any built-in benchmark tool. Thus, for our tests, we used the Tavern area in Kyovashad. This area appeared to be the most demanding in that big city. Do note, however, that the game can display a lot of enemies on screen. Unfortunately, we could not benchmark such a scenario as this happens randomly. So yeah, be sure to keep that in mind.

"
Seriously? It's nothing.
Tavern? In a non-combat area, LOL.

If you guys have some testing software, I'll do it for you.
I've got a 3070 and a 3900X.

I know how to pull up a big train and slam it down.
 
The current state of troubled PC ports.

The Callisto Protocol: remains bad, stuttering is fixed, CPU performance is improved by 15%, but the single-threaded nature of the game remains the same.

Dead Space Remake: still bad, VRAM issues are fixed, but shader stutters and traversal stutters remain horrendous.

Forspoken: good now, VRAM issues are fixed, CPU performance improved by 30%, and SSAO issues are fixed.


Great video @Dictator, a lot of work to cover so many games at once.

Love to see these; while the impetus for publishers to give devs the time/resources to fix these problems is far lower months after launch, any added media pressure helps. I'm still as pessimistic about something like Dead Space being addressed as I was at launch (well, far more now really), but it's still good to let potential purchasers know the current status. I'm not expecting significant change, but it's good to indicate to publishers they can't just ride out the initial wave of negative PR.

One missing critique, at least based on my recent playthrough of the 2-hour trial of Dead Space, was that the wrong LOD bias for DLSS/FSR was still not addressed, so you're getting far worse texture quality with reconstruction enabled, which most systems will likely be using considering the high rendering load of the game. That is very likely an extremely quick problem to address; that they still haven't done so is pretty pathetic. Yes, on Nvidia you can 'kind of' fix it with NvInspector, but that fucks up some of the holographic interface effects, and it's not available to Radeon users.
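For context, the commonly cited integration guideline is to bias texture mip selection by the log of the render-to-output ratio (some integrations add a further offset on top). Skip it and textures sample mips tuned for the lower internal resolution, hence the blur:

```python
# Recommended texture LOD bias under upscaling, per the commonly
# cited guideline: log2(render / display). Negative, so the sampler
# picks sharper mips. Some integrations add an extra offset on top.
from math import log2

render_height = 1440   # internal render height (example)
display_height = 2160  # output height (4K)

lod_bias = log2(render_height / display_height)
print(f"suggested LOD bias: {lod_bias:.2f}")  # ~ -0.58
```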

Also good to see attention given to my nemesis - artifacting when low-res buffer effects like DoF aren't properly integrated with reconstruction - being called out in Returnal. :) This is very similar to what happens with the Resident Evil DLSS upscale mod; it really should not be occurring in a shipped game where the devs have complete control over render targets, and especially not be introduced after a patch. It's pretty egregious.

Good to see Forspoken make some headway with the 'sparkle' DLSS problem when using motion blur (and DoF with XeSS), btw - so many older games fuck up DLSS with motion blur enabled that I suspect many PC gamers' habit of disabling camera motion blur from the outset masked the problem early on. Luckily more games these days factor it in.

 
Again, 1% lows are not adequate to capture stutters! Games measured with frametimes are what I want to see more of when the discussion of 8GB cards comes up; there may be games where it manifests in visible stutters but otherwise barely affects average/1% lows.

That's why I use CapFrameX now as my overlay as it shows stuttering as a percentage as well as a graph.
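If anyone wants to roll their own from a raw frametime log, a stutter percentage in that spirit is easy to approximate - the 2x-median threshold here is a common heuristic, not necessarily CapFrameX's exact metric:

```python
# Estimate a stutter percentage from a frametime log (ms per frame):
# count frames that take more than twice the median frametime.
# The 2x threshold is a common heuristic, not CapFrameX's exact metric.
from statistics import median

frametimes_ms = [16.7, 16.9, 16.6, 48.2, 16.8, 17.0, 33.5, 16.7]  # sample data

threshold = 2 * median(frametimes_ms)
stutters = sum(1 for ft in frametimes_ms if ft > threshold)
print(f"stutter frames: {100 * stutters / len(frametimes_ms):.1f}%")
```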
 
"
Diablo 4 does not feature any built-in benchmark tool. Thus, for our tests, we used the Tavern area in Kyovashad. This area appeared to be the most demanding in that big city. Do note, however, that the game can display a lot of enemies on screen. Unfortunately, we could not benchmark such a scenario as this happens randomly. So yeah, be sure to keep that in mind.

"
Seriously? It's nothing.
Tavern? In a non-combat area, LOL.

If you guys have some testing software, I'll do it for you.
I've got a 3070 and a 3900X.

I know how to pull up a big train and slam it down.

Still high??

[attached screenshot: Untitled.png]
 
I don't think anyone benchmarked a raid. Or, like, 16 necromancers with all their pets spamming abilities.

And you're getting me wrong here, these graphs don't capture the challenge that consoles have.
PCs have their own memory pool for the CPU. It doesn't matter how much data it's modifying per second, there will always be sufficient memory bandwidth for the CPU to do its work.

Once you start carving that CPU traffic out of the GPU's bandwidth, that's a little different. You may want to compare consoles against integrated chipsets to get a better idea of when the CPU eating bandwidth causes a problem.
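Rough illustration with round numbers - the CPU share below is hypothetical, and real hardware doesn't partition bandwidth statically (contention actually costs the GPU more than the raw CPU traffic), but it shows the shape of the problem:

```python
# Unified-memory contention, roughly. PS5's GDDR6 is ~448 GB/s
# shared between CPU and GPU. The CPU figure is hypothetical, and
# real contention costs the GPU more than the raw traffic suggests.
total_bw_gbs = 448
cpu_traffic_gbs = 40   # hypothetical CPU demand under heavy simulation

gpu_leftover = total_bw_gbs - cpu_traffic_gbs
print(f"GPU effectively sees ~{gpu_leftover} GB/s "
      f"({100 * gpu_leftover / total_bw_gbs:.0f}% of peak)")
```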
 
I don't think anyone benchmarked a raid. Or, like, 16 necromancers with all their pets spamming abilities.

And you're getting me wrong here, these graphs don't capture the challenge that consoles have.
PCs have their own memory pool for the CPU. It doesn't matter how much data it's modifying per second, there will always be sufficient memory bandwidth for the CPU to do its work.

Once you start carving that CPU traffic out of the GPU's bandwidth, that's a little different. You may want to compare consoles against integrated chipsets to get a better idea of when the CPU eating bandwidth causes a problem.

There's footage of the ostensibly 30 fps PS4 version stuttering down towards 0fps with only a single player and not a ton happening on screen. This is from Eurogamer:

[image: PS4-2.jpg]


Whether this is down to CPU or not, or streaming, or lack of optimisation, I don't know. But if it is in some way related to CPU it's not hard to imagine scenarios where a raid ballin' through terrain with tons of enemies and stuff going down might be bottlenecked by a PC CPU.

... maybe?
 
Just a comment regarding memory and Diablo 4.

In the beta, my experience was that the game would actually crash after a short while with a memory error, with 16GB system RAM and a limited pagefile size (I believe 4096MB), on a GTX 970 at 1080p. This was on medium (? can't remember exactly, I didn't touch the settings).
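If anyone hits the same thing: I'd guess that's commit exhaustion (RAM + pagefile together) rather than physical RAM running out, which a capped pagefile makes much easier to trigger. A quick sketch for watching both pools:

```python
# Check physical RAM vs swap/pagefile headroom - a capped pagefile
# plus 16GB RAM can exhaust commit space even while some physical
# memory is still free.
import psutil

vm = psutil.virtual_memory()
sw = psutil.swap_memory()
print(f"RAM:  {vm.used / 2**30:.1f} / {vm.total / 2**30:.1f} GiB used")
print(f"Swap: {sw.used / 2**30:.1f} / {sw.total / 2**30:.1f} GiB used")
```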
 
Yea but the console version is using FSR 2? If I wanted a more stable framerate on the 3050, I'd just turn on DLSS2 and get a similar outcome to consoles? Issue solved.

Consoles don't have Nvidia's tensor cores or a DLSS plugin. How were Blizzard supposed to add tensor cores and DLSS to the consoles for Diablo 4?

Why is the 3050 having Nvidia's DLSS relevant to Blizzard's technical achievements on console?
 
That's a really long-winded way to say "lazy devs".

It's hard to believe that games aren't doing basic residency management. No platform has unlimited VRAM.

This is like when Mary and Joseph couldn't find room in any of the Inns in Bethlehem.

Textures are going to be forced to stay in the stables (main ram).
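And the fix is genuinely old tech. A minimal sketch of the kind of LRU residency cache we're talking about - illustrative only, not any engine's actual implementation:

```python
# Minimal texture residency management: a fixed VRAM budget with
# least-recently-used eviction. Real engines layer mip levels,
# priorities, and async uploads on top of this.
from collections import OrderedDict

class ResidencyCache:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.resident = OrderedDict()  # texture_id -> size_bytes

    def touch(self, texture_id, size_bytes):
        if texture_id in self.resident:
            self.resident.move_to_end(texture_id)  # mark recently used
            return
        # Evict least-recently-used textures until the new one fits.
        while self.used + size_bytes > self.budget and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size  # back to the stables (main RAM)
        self.resident[texture_id] = size_bytes
        self.used += size_bytes
```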
 