Digital Foundry Article Technical Discussion [2024]

That's why I mentioned that there are undoubtedly advantages to the UE5 engine, though they mostly apply to development. As I said earlier, now that games built on this modern GPU-driven pipeline are appearing in large numbers, the difference is obvious. "Almost infinite geometry" sounds great on paper, but it is far less appealing when all of that geometry ends up on console at far too low an image resolution, considering the size of today's TVs.

You mentioned that the Nanite technology cannot currently be scaled down, that is, it only works as an on/off switch and therefore requires a lot of power when turned on. Is there a way to change this in the future? Are there any improvements in this direction that could benefit cheaper console hardware?
There has been a lot of optimization to Nanite (and Lumen) from UE 5.1 through 5.5, and they continue to optimize further. But of course we are in diminishing-returns territory unless they have a eureka moment. Skeletal meshes becoming Nanite might help, since "Nanite culls Nanite": skeletal meshes will be able to cull the environment more aggressively, and in turn the skeletal meshes themselves will be culled more aggressively by the environment. Whether that is a net win, only time will tell.
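For anyone wondering what "Nanite culls Nanite" means in practice, it boils down to everything that writes depth feeding the same hierarchical-Z buffer that everything else gets tested against. Here is a toy sketch of that HZB test, nothing like Epic's actual implementation, with made-up depths and cluster bounds:

```python
# Toy illustration of hierarchical-Z (HZB) occlusion culling -- the mechanism
# behind "Nanite culls Nanite". Not Epic's code; depths and bounds are made up.
import numpy as np

def build_hzb(depth):
    """Build a mip chain where each texel stores the MAX (farthest) depth of
    its 2x2 footprint, so the test below is conservative."""
    mips = [depth]
    while mips[-1].shape[0] > 1:
        d = mips[-1]
        h, w = d.shape[0] // 2, d.shape[1] // 2
        mips.append(d[:h*2, :w*2].reshape(h, 2, w, 2).max(axis=(1, 3)))
    return mips

def occluded(mips, x0, y0, x1, y1, cluster_min_depth):
    """Test a screen-space rect against a mip coarse enough to cover it with
    a handful of texels (picked here simply by rect size)."""
    size = max(x1 - x0, y1 - y0)
    level = min(int(np.log2(max(size, 1))), len(mips) - 1)
    scale = 2 ** level
    tile = mips[level][y0 // scale:(y1 // scale) + 1,
                       x0 // scale:(x1 // scale) + 1]
    # Occluded only if the cluster is behind the farthest occluder everywhere.
    return cluster_min_depth >= tile.max()

# Depth buffer after the first pass: a big wall covers the left half at 0.3.
depth = np.full((256, 256), 1.0)   # 1.0 = far plane, nothing drawn
depth[:, :128] = 0.3
hzb = build_hzb(depth)

# A skeletal-mesh cluster hiding behind the wall, and an environment cluster
# out in the open: the same HZB culls both kinds of geometry.
print(occluded(hzb, 10, 10, 60, 200, cluster_min_depth=0.7))    # True: culled
print(occluded(hzb, 150, 10, 200, 200, cluster_min_depth=0.7))  # False: visible
```

Once skeletal meshes render through the same path, their depth contributes to that buffer too, which is where the "culls more aggressively in both directions" idea comes from.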

But honestly, my take on UE5 is that the engine has a lot of overhead because it's a general-purpose engine. Epic has a lot of toggles in place to disable some of these features, but the more toggles you add, the harder the code is to maintain, so it's a balancing act for Epic. I bet there is some low-hanging fruit in there (for instance, the G-buffer in UE was pretty fat the last time I looked at it, though that might have changed; I haven't checked recently). The small teams that use UE just don't have the manpower, experience or time to dive deep into the engine code and identify the features that add overhead they don't actually need. And because the engine has so many features and use cases, it's harder to modify, as systems tend to depend on each other.
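On the "fat G-buffer" point, a quick back-of-the-envelope shows why it matters. The layout below is hypothetical (I haven't checked UE's current render targets either); the arithmetic is the point:

```python
# Back-of-envelope cost of a "fat" G-buffer. The layout below is hypothetical
# (UE's actual targets vary by version and enabled features); the point is
# just how quickly per-pixel bytes turn into bandwidth.
targets = {                        # bytes per pixel
    "base color + AO":        4,   # RGBA8
    "normal":                 4,   # packed
    "metallic/spec/rough":    4,
    "custom / shading model": 4,
    "velocity":               4,
    "depth/stencil":          4,
}
bytes_per_pixel = sum(targets.values())

width, height, fps = 3840, 2160, 60
gbuffer_mb = bytes_per_pixel * width * height / 1e6

# Assume the G-buffer is written once and read a couple of times per frame
# (lighting, post). Real engines compress and cache, so this is an upper bound.
touches_per_frame = 3
bandwidth_gbs = gbuffer_mb * touches_per_frame * fps / 1e3

print(f"{bytes_per_pixel} B/pixel -> {gbuffer_mb:.0f} MB at 4K")
print(f"~{bandwidth_gbs:.0f} GB/s just touching the G-buffer at {fps} fps")
```

Shave one render target off that and you get a measurable chunk of bandwidth back every frame, which is exactly the kind of low-hanging fruit a small team never has time to go looking for.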

But the problem is not only UE; Unity has the same issues. Remember Cities: Skylines II: the devs used middleware for the characters, and that middleware produced very high-poly characters, right down to teeth (who needs teeth in a top-down city builder?). It's a simple thing, but the team missed it (and they missed other things too) and shipped anyway. The point is, smaller teams without custom engines have fewer in-house tools and lean on middleware (and of course the engine itself can also be seen as middleware) that they don't fully understand. All middleware is general purpose and will add overhead; you need to fully understand it and have the time to optimize around it. Most smaller teams without a custom engine have fewer engineers, while teams with a custom engine also have in-house engineers who understand the tech and can educate their artists. Another thing that doesn't help is all the layoffs: teams form and fall apart, and each time they have to start almost from scratch. There is an upside to all this, though. Thanks to these easy-to-use engines there are more devs making games than ever, so you now have a choice. If a game runs badly, just don't buy it; there are more games coming out than you have time to play anyway.
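To put rough numbers on that character example (these figures are made up for illustration, not the game's real asset budgets), hidden per-asset detail multiplies across everything on screen:

```python
# Hypothetical numbers showing how hidden per-character detail multiplies.
# Illustrative figures only, not Cities: Skylines II's actual budgets.
visible_agents = 2_000              # citizens on screen in a dense city view
tris_needed_at_city_zoom = 500      # what a top-down camera can actually resolve
tris_in_middleware_asset = 30_000   # full face rig, teeth, etc.

wasted = visible_agents * (tris_in_middleware_asset - tris_needed_at_city_zoom)
print(f"{wasted / 1e6:.0f}M triangles spent on detail the camera never resolves")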
 

DF Direct Weekly #187: Concord Dead, Indiana Jones Hands-On, Horizon PC Tested, 'Disaster Remasters'


0:00:00 Introduction
0:01:13 News 1: Concord developer shuttered, game not coming back
0:18:16 News 2: Indiana Jones previewed!
0:31:56 News 3: Mario & Luigi: Brothership evaluated
0:42:47 News 4: Horizon Zero Dawn Remastered analyzed on PC
0:55:00 News 5: Shadows of the Damned is a ‘disaster remaster’
1:05:32 News 6: Alex’s Star Citizen update
1:21:01 Supporter Q1: Do you still think mid-gen enhanced consoles are unnecessary?
1:28:29 Supporter Q2: Will PS5 Pro owners typically own 120Hz VRR panels?
1:34:43 Supporter Q3: Could the Series X support a form of ML-based upscaling?
1:37:23 Supporter Q4: Why did Call of Duty abandon ray tracing?
1:40:57 Supporter Q5: Why has checkerboard rendering disappeared from most modern games?
1:43:33 Supporter Q6: What has John contributed to the upcoming EGM Compendium?
1:47:02 Supporter Q7: Will Capcom adopt a more PC-centric approach in the future?
1:51:45 Supporter Q8: Will Nintendo pioneer revolutionary uses of RT on Switch 2?
 

It's back.
Ollie was right about Metaphor.

Honestly didn’t think it stood a chance. Never seen a worse-optimized game.

I’m not quite sure what bottleneck was opened up with the PS5 Pro to allow it to shift.
 
Ollie was right about Metaphor.

Honestly didn’t think it stood a chance. Never seen a worse-optimized game.

I’m not quite sure what bottleneck was opened up with the PS5 Pro to allow it to shift.
I am not quite sure how Sony could make 30 to 40% improvements in GPU limited scenes on unpatched games. And John said some scenes in DMC5 showed higher improvements (too bad they don't show them).

Are they using some kind of software emulation layer like Xbox to run those games?
 
I am not quite sure how Sony could make 30 to 40% improvements in GPU limited scenes on unpatched games. And John said some scenes in DMC5 showed higher improvements (too bad they don't show them).

Are they using some kind of software emulation layer like Xbox to run those games?
If we assume emulation, then it's straightforward: we can assume they have been doing this since the PS5.

If there's no emulation, it gets trickier; there would be a lot of discussion about how this would work.

But in either scenario, emulation or not, the games have access to much larger caches and much more bandwidth, which I think is a big part of the performance boost happening here.

Having more compute doesn’t do anything if you’re sitting idle waiting for work to do.

In this respect the PS5 Pro has significantly more than the PS5, and even if it were running the same 36 CUs, it's not going to sit idle as long, both when hitting cache and when pulling from VRAM.
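A rough roofline-style sketch of that last point, using the commonly reported headline figures (PS5 ~10.3 TFLOPS / 448 GB/s, PS5 Pro ~16.7 TFLOPS / 576 GB/s) and a made-up arithmetic intensity for the workload:

```python
# Rough roofline-style sketch: how much compute you can actually feed for a
# given memory bandwidth. TFLOPS/bandwidth are the commonly reported figures;
# the workload's arithmetic intensity is a made-up example value.
def attainable_tflops(peak_tflops, bandwidth_gbs, flops_per_byte):
    # Attainable throughput = min(compute roof, bandwidth * arithmetic intensity)
    return min(peak_tflops, bandwidth_gbs * flops_per_byte / 1000.0)

workload_intensity = 15.0  # FLOPs per byte fetched -- hypothetical pass

for name, tflops, bw in [("PS5", 10.3, 448), ("PS5 Pro", 16.7, 576)]:
    usable = attainable_tflops(tflops, bw, workload_intensity)
    print(f"{name}: {usable:.1f} of {tflops} TFLOPS usable "
          f"({100 * usable / tflops:.0f}% busy) at {workload_intensity} FLOP/B")
```

On a bandwidth-bound pass the usable throughput tracks the ~28% bandwidth increase rather than the TFLOPS number, and the bigger caches push effective bandwidth higher still, which lands in the ballpark of the gains being measured here.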
 
If we assume emulation, then it's straightforward: we can assume they have been doing this since the PS5.

If there's no emulation, it gets trickier; there would be a lot of discussion about how this would work.

But in either scenario, emulation or not, the games have access to much larger caches and much more bandwidth, which I think is a big part of the performance boost happening here.

Having more compute doesn’t do anything if you’re sitting idle waiting for work to do.

In this respect the PS5 Pro has significantly more than the PS5, and even if it were running the same 36 CUs, it's not going to sit idle as long, both when hitting cache and when pulling from VRAM.
PS4 BC on PS5 (and PS4 BC on PS4 Pro) was strict hardware BC, and we could see the benefit of the GPU overclock and the CPU overclock depending on the game. On PS4 Pro the only games gaining up to 30% better performance were the CPU-limited ones; GPU-limited games (like Killzone Shadow Fall) only saw about 10% or less (here some GPU-limited games get a 40% boost, which I didn't expect at all).

I also think performance will likely improve with patches (as the emulation gets better, something that obviously wasn't possible with hardware BC). Performance of PS5 Pro games could also improve, the same way we got a performance boost in some PS5 games after an OS update.
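A simple frame-time model shows why the old boost-mode gains split the way they did. Clock figures are the commonly cited ones (CPU 1.6 → 2.13 GHz, GPU 800 → 911 MHz with the extra CUs left unused); the per-game CPU/GPU frame-time splits are hypothetical:

```python
# Why CPU-limited games gained more from PS4 Pro boost mode than GPU-limited
# ones: unpatched games only saw the clock bumps. The frame-time split per
# game below is a made-up example.
cpu_boost = 2.13 / 1.6    # ~1.33x
gpu_boost = 911 / 800     # ~1.14x

def boosted_fps(cpu_ms, gpu_ms):
    # CPU and GPU work overlap, so the frame is paced by the slower of the two.
    base = max(cpu_ms, gpu_ms)
    boosted = max(cpu_ms / cpu_boost, gpu_ms / gpu_boost)
    return 1000 / base, 1000 / boosted

for name, cpu_ms, gpu_ms in [("CPU-limited game", 33.0, 20.0),
                             ("GPU-limited game", 15.0, 33.0)]:
    before, after = boosted_fps(cpu_ms, gpu_ms)
    print(f"{name}: {before:.0f} -> {after:.0f} fps (+{100*(after/before-1):.0f}%)")
```

That reproduces the roughly 30% vs roughly 10-15% split from last gen, which is also why 40% gains in GPU-limited scenes on the Pro are hard to explain without bandwidth, cache or some software layer in the picture.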
 
I am not quite sure how Sony could make 30 to 40% improvements in GPU limited scenes on unpatched games. And John said some scenes in DMC5 showed higher improvements (too bad they don't show them).

Are they using some kind of software emulation layer like Xbox to run those games?

It's no different to taking a 2013 PC game running on Windows 7 to a brand new PC running a 4090 and Windows 11.
 
PS4 BC on PS5 (and PS4 BC on PS4 Pro) was strict hardware BC, and we could see the benefit of the GPU overclock and the CPU overclock depending on the game. On PS4 Pro the only games gaining up to 30% better performance were the CPU-limited ones; GPU-limited games (like Killzone Shadow Fall) only saw about 10% or less (here some GPU-limited games get a 40% boost, which I didn't expect at all).

I also think performance will likely improve with patches (as the emulation gets better, something that obviously wasn't possible with hardware BC). Performance of PS5 Pro games could also improve, the same way we got a performance boost in some PS5 games after an OS update.

Not sure if "emulation" really comes into it, at least in the sense most people think of emulation. The restriction of the PS4 Pro's Boost Mode not letting games take advantage of the extra CUs could have just been a limitation of the API/SDK at the time; the PS5's development environment was likely designed so games wouldn't shit the bed if they encountered different hardware (within reason). The PS5 Pro was reportedly already well in the works when the PS5 first shipped.
 

DF Direct Weekly #188: Death Stranding Xbox, God of War Ragnarök PS5 Pro, Switch 2 Back Compat!

0:00:00 Introduction
0:01:08 News 1: God of War Ragnarök PS5 Pro patch tested!
0:13:36 News 2: Nintendo confirms Switch 2 backwards compatibility
0:25:27 News 3: Death Stranding lands on Xbox
0:38:11 News 4: No Man’s Sky patched for PS5 Pro
0:47:49 News 5: Sony INZONE M10S impressions
1:05:53 News 6: Sonic Generations can run at 60fps on Switch
1:11:42 Supporter Q1: Will you use the 9800X3D for high-end gaming benchmarks?
1:16:21 Supporter Q2: Will PS6 use 3D V-Cache?
1:17:32 Supporter Q3: Will Alex and John switch to 9800X3D?
1:22:15 Supporter Q4: Why is Game Boost falling short of the promised 45% increase to PS5 Pro raster performance?
1:29:21 Supporter Q5: Would AI frame extrapolation make native frame-rate unimportant?
1:32:30 Supporter Q6: What do you want out of a Steam Deck 2?
1:38:32 Supporter Q7: Why didn’t you spend more time analyzing the PS5 Pro box?
 
480Hz or bust. We need better frame gen to get from 120 or 240 up to 480 and higher. Glad to hear John praising 480Hz.

Don't get too used to it, John! You'll have a hard time playing those console games.
 
I really wish they would work on eliminating sample and hold altogether instead of chasing ever higher refresh rates. We don’t need frame gen. Our eyes and brains are already designed to do interpolation. It’s just a bandage on a fundamentally broken display tech.
 
I really wish they would work on eliminating sample and hold altogether instead of chasing ever higher refresh rates. We don’t need frame gen. Our eyes and brains are already designed to do interpolation. It’s just a bandage on a fundamentally broken display tech.

The Asus 480Hz OLED supports BFI, so you can run it at 240Hz with BFI. Honestly though, the idea that your eyes will just do the interpolation from a lower frame rate like 120 and achieve the same result as real 480Hz ... not sure I buy it. Films have motion blur baked in via shutter speed, and the language of cinematography basically works around what cameras are not good at (panning).

MPRT (moving picture response time) does improve greatly as you move up in Hz, but that's not the only benefit of increasing fps. Assuming you had one display with 1ms MPRT at 120Hz and another with 1ms MPRT at 480Hz, I think the 480Hz display would still look a lot smoother in motion.

Unless you moved away from rendering frames entirely and just wrote out pixels somehow, so you didn't have to worry about vsync or tearing ... I don't know. I think we're going to get good frame gen way before we totally overturn how GPUs render frames.
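To put numbers on the MPRT point, the rule of thumb is that perceived smear is roughly eye-tracking speed times persistence; the tracking speed below is just an illustrative value:

```python
# Perceived blur on a sample-and-hold display is roughly eye-tracking speed
# times frame persistence. The tracking speed here is an illustrative value.
tracking_speed_px_s = 1920   # eye tracking a pan that crosses a 1080p-wide screen per second

def blur_px(persistence_ms):
    return tracking_speed_px_s * persistence_ms / 1000

modes = {
    "120 Hz sample-and-hold":     1000 / 120,  # ~8.3 ms persistence
    "240 Hz sample-and-hold":     1000 / 240,  # ~4.2 ms
    "480 Hz sample-and-hold":     1000 / 480,  # ~2.1 ms
    "120 Hz + BFI (1 ms strobe)": 1.0,         # persistence set by strobe length
}
for name, persistence in modes.items():
    print(f"{name}: ~{blur_px(persistence):.1f} px of smear")
```

Which is why both camps have a point: quadrupling the refresh rate cuts the smear by 4x, but a short strobe at 120Hz gets persistence lower than even 480Hz sample-and-hold, at the cost of brightness and of needing the frame rate to match the strobe.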
 
The Asus 480Hz OLED supports BFI, so you can run it at 240Hz with BFI. Honestly though, the idea that your eyes will just do the interpolation from a lower frame rate like 120 and achieve the same result as real 480Hz ... not sure I buy it. Films have motion blur baked in via shutter speed, and the language of cinematography basically works around what cameras are not good at (panning).

MPRT (moving picture response time) does improve greatly as you move up in Hz, but that's not the only benefit of increasing fps. Assuming you had one display with 1ms MPRT at 120Hz and another with 1ms MPRT at 480Hz, I think the 480Hz display would still look a lot smoother in motion.

Unless you moved away from rendering frames entirely and just wrote out pixels somehow, so you didn't have to worry about vsync or tearing ... I don't know. I think we're going to get good frame gen way before we totally overturn how GPUs render frames.

Yes, a higher-refresh sample-and-hold display is better than a lower-refresh sample-and-hold display.

I would bet serious money that a 120Hz strobing display with “brain interpolation” will look better than a 480Hz sample-and-hold display with frame gen.
 