Digital Foundry Article Technical Discussion [2024]

All those games are GPU and compute limited without any kind of hardware RT (which is CPU demanding). Getting 13% better performance with 16% more compute and 25% more bandwidth is actually disappointing.

Average of 13% is ok IMO. Disappointing for me would be when there's no advantage to the extra compute and BW, or even worse when the Series X version is noticeably worse. Not uncommon in cross gen games where it's basically last gen geometry and shaders and you're scaling for resolution. Stuff targeting current gen is increasingly showing what the Series X was really built for IMO.

There's a whole other angle about how maybe 13% is pretty good when PS5 has outsold Series consoles 2:1 (and Series X 4:1), but that's beyond the scope of something DF (or anyone else) can properly quantify.

It does bode well for UE5 though, and it indicates that, like the PS5, the Series consoles really do love lots of geometry if handled appropriately.

Other kinds of games can run better on PS5, like Baldur's Gate 3 (CPU limited?), Cyberpunk 2077 (CPU / I/O limited) or Hogwarts Legacy, which runs better in fidelity RT mode and during open-world I/O streaming (XSX being better in fidelity mode when there is no I/O involved, or without hardware RT, similar to those latest UE5 games).

Baldur's Gate 3 is definitely CPU limited in the Act 3 city, and possibly helped on PS5 by the fact the game can tear at both the top and bottom of the screen and across a greater percentage of the vertical height. It's probably struggling with an AI or scripting task, so it's unrelated to the API and down to the code the developers are using (or the compiler or something like that).

Cyberpunk on Series X has a minimum resolution window that's some 30% higher than on PS5, and it's using slightly higher LOD settings. Given the same lower bounds and LOD I think it would run about as well or maybe better. The Phantom Liberty preview that was matched on both PS5 and Series X had a fixed resolution and ran noticeably better on Series X.

Series consoles can stream in BVH data direct from SSD just like PS5, but it's up to developers to do that.

XSX is more powerful in some games, PS5 is more powerful in other games.

True, but the move to lower native resolutions, higher per pixel workloads, and micro polys appears to be benefiting the Series consoles relatively more than the PS5 - as you might expect from a system with lots of compute and bandwidth.
 
There's a whole other angle about how maybe 13% is pretty good when PS5 has outsold Series consoles 2:1 (and Series X 4:1), but that's beyond the scope of something DF (or anyone else) can properly quantify.
There's also an angle on what does 13% actually get you? How much real difference does it make? I think even more than last gen, it makes zero difference to the players on either machine; the differences are so small they'd never know except by reading about it, and it has no bearing on which machine to buy. Contrast that with the original Xbox versus PS2, say, where the extra power definitely got you more and could be a deciding factor in what to buy.

From the PS3-era heyday of very interesting results, comparisons are now just a few percentage points in a spreadsheet, I think. Beyond the curiosity of investigating the difference, and the joys of the outliers where there's a notable difference, platform comparisons are all a bit meh.
 
I said it all the way back in 2020: the advantage the XSX has is simply not enough to give it any meaningful edge over PS5.

And in the best case it would allow it to stabilise frame rate drops found on PS5.

There's a larger performance difference between the RTX 3090 and RTX 3090 Ti than there is between PS5 and XSX.
 
There's also an angle on what does 13% actually get you? How much real difference does it make?
It gets you about 4ms at 30fps and 2ms at 60fps. Especially without VRR, 10% is a lot. An entire 2ms of cushion to keep a locked 60fps target is great; you can get a lot done in 2ms.
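
For concreteness, here's the back-of-envelope arithmetic behind those numbers as a small Python sketch. The ~13% figure is just the DF average quoted above, and it assumes a purely GPU-bound frame, which is an assumption rather than a measurement:

# Rough headroom a ~13% GPU throughput advantage buys at a given frame-rate target,
# assuming the frame is entirely GPU-bound (an assumption, not a DF measurement).
def headroom_ms(target_fps: float, speedup: float = 1.13) -> float:
    frame_budget = 1000.0 / target_fps           # ms available per frame
    return frame_budget * (1.0 - 1.0 / speedup)  # ms freed up by the faster GPU

for fps in (30, 60):
    print(f"{fps} fps: ~{headroom_ms(fps):.1f} ms of extra headroom")
# Prints roughly 3.8 ms at 30 fps and 1.9 ms at 60 fps - i.e. the ~4 ms / ~2 ms above.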
 
1) If there is any game that runs well on UE5, that shows UE5 isn't the problem.

2) Devs don't have to use features of an engine if they don't run well. The fact a game engine can scale up to uber levels doesn't mean that needs to happen. Use UE5 and use baked lights instead of Lumen - that's an option (a rough sketch of what that looks like follows below). Nanite too demanding? Don't use it. Use the same techniques used prior to these techs. If devs throw too much at a game, it'll crawl on any engine. See Crysis 2! UE5 was not created just for consoles, but for many platforms, including scaling up to ultra PCs (and even driving Hollywood FX), but taking an uber-PC game and sticking it on consoles isn't going to work well.
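
To make that concrete, here's a rough sketch of what "use UE5 but skip Lumen" can look like at the project-config level. These renderer settings are written from memory and the exact console variable names/values shift between UE5 versions, so treat this as illustrative rather than authoritative:

[/Script/Engine.RendererSettings]
; Illustrative values only - verify against your engine version's BaseEngine.ini.
; Allow baked lightmaps instead of relying on fully dynamic lighting:
r.AllowStaticLighting=True
; Dynamic GI method: 0 = none (no Lumen GI):
r.DynamicGlobalIlluminationMethod=0
; Reflections: 2 = screen-space reflections rather than Lumen reflections:
r.ReflectionMethod=2

Nanite is similar: leave it disabled per mesh (or toggle it globally with the r.Nanite console variable) and the game falls back to traditional LOD meshes.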


Yes and no. If it's because the devs aren't managing expectations and are throwing in all the pretty bells and whistles because they make for gorgeous screenshots that drive early interest, that's on the devs. And that's what a lot seem to be doing, chasing the 'next gen' look even if the hardware isn't up to it. "Wow, check out this Lumen! That looks great. Woah, see this Nanite geometry! Amazing, let's get that in there. Okay, let's build for console... oh."

UE5 provides plenty of flexibility to use different techniques. That moves the onus onto the developers to choose wisely, and it seems a lot can't. But we know this. The moment 'post processing' became a thing, we got excessive brown-o-vision, bloom and chromatic aberration. These were layered on with a trowel just to ram home that 'next gen' look. Give devs some amazing lighting results and geometry options and they'll pile it on and then step back and admire their visuals, ignoring the framerate. This isn't anything new. There were plenty of 20 fps games on PS2, pushing the graphical (or gameplay) features beyond the hardware's ability to run them at a decent framerate.

So going back to point one, where you say it's UE5's fault: the existence of one game that uses those features and runs well proves it's not an engine limitation. The engine can run well, even using high end features. Either design your game to fit the features so they'll run well, or pick the features that'll run your game at the preferred framerate.

We can only point to UE5 being bad if there are other engines achieving the same visuals/features at better framerates. In the absence of that, there's no evidence other than circumstantial that UE5 is dodgy, and that's neither a logical nor a fair conclusion to jump to.
And the million dollar question is: what defines a good engine?
 
I read on Reddit that nVidia has stopped supporting GPU PhysX with Lovelace. So something like those Batman games will crash or show some strange physics problems.
 
A trip down memory lane with PhysX.

I had that game, it was by far one of the best games on the X360 (the brownie filter looked odd but the game also looked clean, jaggies-wise, iirc). I didn't play it as much as it deserved though, 'cos I was so much into Oblivion back then. Seeing mountains in the distance and actually having the possibility of just walking there in a seamless map was so extraordinary at the time.
 
I read on Reddit that nVidia has stopped supporting GPU PhysX with Lovelace. So something like those Batman games will crash or show some strange physics problems.

That's interesting, I've definitely had Arkham City running on my 4070Ti with no problem in the past, but more recently I seem to recall trying to start it up and it crashing. I will test again when I'm home later.
 
That's interesting, I've definitely had Arkham City running on my 4070Ti with no problem in the past, but more recently I seem to recall trying to start it up and it crashing. I will test again when I'm home later.
Asylum crashed with the latest PhysX version; with an older one the PhysX effects didn't work. City crashed for me, too.
 
Maybe the best engine is the one with the best compromise between performance and empowering the artists. Another good question would be how an image is a composition of techniques, and how intertwined art and realtime technology are. What is pixel peeping, and why is the photography analogy with realtime rendering interesting?

And I like to talk about this interview and the importance of good tools too.


Decima Engine has very good tools, and many are inspired by Unreal Engine's tools. Maybe Epic knows what they are doing.

EDIT: Maybe we have very powerful hardware and the problem is more on the software and API side @Lurkmass @Andrew Lauritzen @JoeJ
 
Asylum crashed with the latest PhysX version; with an older one the PhysX effects didn't work. City crashed for me, too.

Skill issue. :) All 3 work fine here on my 3060 without messing with any PhysX installs. The only PhysX game where I have a stability issue is Black Flag and its smoke effects.

I think Fallout 4 also crashes with GPU Physx effects enabled.

Yeah, I think that's true, albeit I think that was always the case - more of a Bethesda thing than Nvidia fucking something up in a later driver, I believe.
 