DavidGraham
Veteran
Regarding the recurring VRAM debate around Diablo 4.
Honestly, I suspect this is a fairly CPU-heavy game.
"
Am I supposed to be impressed that this average looking game runs at 60 fps when better looking games run at higher resolutions at the same 60 fps or better?
"
Outside of some small 1fps drops in towns on the PS5, the game is pretty much locked at 60fps on consoles from all reports I've seen. Due to the isometric design and the quality setting used (likely Balanced here), FSR2 gets very close to native 4K in final appearance. Most console players will get a very solid-performing, 4K-like experience. The vast majority would have no idea what the base rendering resolution was, as that's the entire point of reconstruction.
Amazing? No, but not remotely comparable to the state of many console->PC ports. God I wish most PC ports were this 'poor'.
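For reference, FSR2's quality modes use fixed per-axis scale factors (these are from AMD's public FSR2 documentation), which is where the ~1300p estimate comes from; a quick sketch:

```python
# FSR2 per-axis scale factors, as published in AMD's FSR2 documentation.
FSR2_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w, out_h, mode):
    """Internal render resolution FSR2 reconstructs from, for a given output."""
    factor = FSR2_MODES[mode]
    return round(out_w / factor), round(out_h / factor)

for mode in FSR2_MODES:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K {mode}: {w}x{h}")
# Balanced at 4K comes out to ~2259x1271, i.e. the ~1300p ballpark above.
```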
"
Yea but the console version is using FSR 2? If I wanted a more stable framerate on the 3050, I'd just turn on DLSS2 and get a similar outcome to consoles? Issue solved. Again it's good that Blizzard made sure the framerate was a semi-locked 60, but the game is so visually unimpressive that more is expected. I guess that's where we disagree. Some people believe that the game looks good, but I'm not one of those people. Looking at the environments, the highly controlled camera, etc, it's a shame this is the best they could do...
"
No, it can't. The 1% lows on the 3050 from Tom's Hardware's charts are just over 50fps; that would give you far more frame drops than we're seeing from the console versions. Even 0.1% lows of 50fps would be less consistent than what I've seen on consoles, never mind 1%. Remember, that's an average of the 1% lows, which means it could be dropping well below 50fps at points too.
To be consistently over 60fps in the 1% lows at 1440p, you need a 2070/3060 Ti. Yes, the 3060 Ti would be considerably over 60fps, but as has been explained numerous times, reconstruction has a cost: FSR2 starting from ~1300p will very likely be more costly than native 1440p, probably significantly so.*
All this says is that Blizzard prioritized a stable framerate, as they should, especially when FSR2 works so well here. At actually comparable settings, PC performance lands somewhere between a 3060 and a 3060 Ti. That is not exceptionally poorly optimized at all; it's nothing like, say, needing a 3080 Ti to equal PS5 performance in TLOU.
*Edit: Here, check this out. Just did this test:
God of War, mostly console settings, 4k with FSR Performance (so native 1080p): 67fps
Native 1440p: 71fps
1080p with FSR2 is more costly than native 1440p. If you wanted a higher resolution than 1440p with simple bilinear upscaling, you very likely could have gotten it, probably at least 1620p or higher. Blizzard just felt FSR2 was the right choice here, and I'd say they look to be correct (and if they had just used regular scaling, you would hear countless complaints about how they didn't 'even bother' with reconstruction, just as Naughty Dog has received).
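Putting those two numbers in frame-time terms (just back-of-the-envelope arithmetic on the results above):

```python
# The two measured averages from the God of War test above.
fps_fsr2_perf_4k = 67    # 4K output, FSR2 Performance (native 1080p internal)
fps_native_1440p = 71    # native 1440p, no reconstruction

ms_fsr2 = 1000 / fps_fsr2_perf_4k    # ~14.9 ms per frame
ms_nat = 1000 / fps_native_1440p     # ~14.1 ms per frame

# 1080p is ~44% fewer pixels than 1440p, yet the FSR2 path is still
# ~0.8 ms slower per frame: the reconstruction pass (which runs at the
# 4K output resolution) costs more than the raster savings.
print(f"FSR2 Perf 4K: {ms_fsr2:.1f} ms | native 1440p: {ms_nat:.1f} ms "
      f"| delta: {ms_fsr2 - ms_nat:+.1f} ms")
print(f"pixel ratio 1080p/1440p: {(1920 * 1080) / (2560 * 1440):.2%}")
```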
"
I think it may actually be a bit more CPU-heavy than we think, I saw a video of a 3600X having trouble keeping a 2070 at full load at 1080p. We are talking sitting between 50-70% utilization. I'll try to find the video.
"
Its minimum CPU requirement on PC is the now 12-year-old i5-2500K, so I suspect it's not that demanding.
Well, you're right and I'm wrong. The video I saw had it locked at 75fps, and while it was dropping into the 60s and load was not going above 70%, it was also from the beta two months ago, so I'm gonna take the L on this one.
"
Diablo 4 does not require a high-end CPU for gaming at over 100fps. By simulating only two cores (with SMT enabled), we were able to run the game with a minimum of 150fps at 1080p/Ultra Settings. So, if you own an old CPU, you'll be completely fine and you'll be able to run it.
"
Link
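For anyone who wants to reproduce that kind of core-count test, pinning the game to a subset of logical CPUs gets you most of the way there. A minimal sketch using psutil (the process name is a guess, and the logical-CPU-to-core mapping is system-specific, so check yours first):

```python
import psutil

# Guessed executable name; replace with whatever your task manager shows.
TARGET = "Diablo IV.exe"

# Two physical cores with SMT usually means four logical CPUs, but the
# logical-CPU-to-core mapping varies by system - verify before trusting it.
two_cores_smt = [0, 1, 2, 3]

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        proc.cpu_affinity(two_cores_smt)  # restrict scheduling to these CPUs
        print(f"Pinned PID {proc.pid} to logical CPUs {two_cores_smt}")
```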
The current state of troubled PC ports.
The Callisto Protocol: remains bad. Stuttering is fixed and CPU performance is improved by 15%, but the single-threaded nature of the game remains the same.
Dead Space Remake: still bad. VRAM issues are fixed, but shader stutters and traversal stutters remain horrendous.
Forspoken: good now. VRAM issues are fixed, CPU performance is improved by 30%, and the SSAO issues are fixed.
"
Seriously? It's nothing.
A tavern? In a non-combat area, LOL.
"
Again, 1% lows are not adequate to capture stutters! Frametime measurements are what I want to see more of when the discussion of 8GB cards comes up; there may be games where the problem manifests as visible stutters but otherwise barely affects the average/1% lows.
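To make that concrete, here's a toy sketch of how a handful of big hitches can hide in the 1% low; the frametimes are synthetic, but the same analysis works on a real PresentMon/CapFrameX frametime log:

```python
import statistics

def one_percent_low_fps(frametimes_ms):
    """Conventional 1% low: average FPS over the slowest 1% of frames."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)
    return 1000 / statistics.mean(worst[:n])

def stutter_spikes(frametimes_ms, factor=2.0):
    """Count frames that took more than `factor` x the median frame time."""
    median = statistics.median(frametimes_ms)
    return sum(1 for t in frametimes_ms if t > factor * median)

# Synthetic capture: ~167 s of a steady 16.7 ms (60fps) with five 50 ms hitches.
frametimes = [16.7] * 10_000
for i in range(0, 10_000, 2_000):
    frametimes[i] = 50.0

print(f"average: {1000 / statistics.mean(frametimes):.1f} fps")  # ~59.8 fps
print(f"1% low:  {one_percent_low_fps(frametimes):.1f} fps")     # still ~54 fps
print(f"spikes:  {stutter_spikes(frametimes)} frames over 2x median")
```

The averages look perfectly healthy, yet those five 50 ms frames would be plainly visible in play; only the frametime trace shows them.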
"
Diablo 4 does not feature any built-in benchmark tool. Thus, for our tests, we used the Tavern area in Kyovashad. This area appeared to be the most demanding in that big city. Do note, however, that the game can display a lot of enemies on screen. Unfortunately, we could not benchmark such a scenario as this happens randomly. So yeah, be sure to keep that in mind.
"
Seriously? It's nothing.
A tavern? In a non-combat area, LOL.
If you guys have some testing software, I'll do it for you.
I've got a 3070 and a 3900x.
I know how to pull a big train of mobs and slam it down.
I don't think anyone benchmarked a raid, or something like 16 necromancers with all their pets spamming abilities.
And you're getting me wrong here: these graphs don't capture the challenge that consoles have.
PCs have a dedicated memory pool for the CPU. It doesn't matter how much data the CPU is modifying per second; there will always be sufficient memory bandwidth for it to do its work.
Once you start carving that traffic out of the GPU's bandwidth, it's a different story. You may want to compare consoles against integrated chipsets to get a better idea of whether CPU-consumed bandwidth is causing a problem.
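Rough numbers to illustrate (448 GB/s is the PS5's published GDDR6 bandwidth; the CPU traffic figure is purely an assumption for the example, and real contention penalties are worse than simple subtraction):

```python
# PS5's published unified GDDR6 bandwidth.
total_bandwidth_gbs = 448.0

# Assumed CPU memory traffic while the sim is churning - illustrative only;
# real numbers depend entirely on the workload.
cpu_traffic_gbs = 40.0

# On PC the CPU has its own DDR pool, so the GPU keeps its full GDDR budget.
# On a console, every byte the CPU moves comes out of the shared pool, and
# mixed CPU/GPU access patterns cost extra efficiency on top of the raw GB/s.
gpu_share = total_bandwidth_gbs - cpu_traffic_gbs
print(f"GPU is left with {gpu_share:.0f} GB/s "
      f"({gpu_share / total_bandwidth_gbs:.0%} of the pool)")
```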
That's a really long-winded way to say "lazy devs".
It's hard to believe that games aren't doing basic residency management. No platform has unlimited VRAM.
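For a sense of what "basic residency management" could look like, here's a toy LRU sketch (all names, sizes, and the budget are invented for illustration): textures get streamed out least-recently-used-first once a VRAM budget is exceeded.

```python
from collections import OrderedDict

class TextureResidency:
    """Toy residency manager: keep recently used textures resident and
    evict the least recently used ones once the VRAM budget is exceeded."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.resident = OrderedDict()  # texture_id -> size in bytes

    def touch(self, tex_id, size_bytes):
        """Mark a texture as needed this frame, streaming it in if necessary."""
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)  # now the most recently used
            return
        # Evict the coldest textures until the new one fits.
        while self.used + size_bytes > self.budget and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size
        self.resident[tex_id] = size_bytes
        self.used += size_bytes

# Illustrative budget: ~6 GiB of an 8 GB card, leaving headroom for
# render targets, buffers, and the OS.
pool = TextureResidency(6 * 1024**3)
pool.touch("hero_albedo_4k", 21 * 1024**2)
```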