Digital Foundry Article Technical Discussion [2024]

I just ordered a custom PC build for my sister, with a 7800X3D + AIO cooler, 32GB DDR5-6000 RAM, case + power supply, etc., for around US$950. That includes assembly and shipping. It does not include an SSD or GPU because my sister already has them, but you can add a reasonably good 2TB SSD for ~$150 and a 4090 for ~$2,000. That means you can get a top-of-the-line PC for less than $3,500. If you settle for a 4080, which is ~$1,200, you can get one for less than $2,500.
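
To sanity-check the totals, here's the arithmetic with the rough prices quoted above; the exact figures will obviously drift with sales and region, so treat them as ballpark assumptions:

```python
# Rough build-cost math using the ballpark prices quoted above (assumptions, not exact quotes).
base_build = 950    # 7800X3D + AIO + 32GB DDR5-6000 + case + PSU, assembled and shipped
ssd_2tb    = 150    # "reasonably good" 2TB SSD
gpu_4090   = 2000
gpu_4080   = 1200

print(f"With a 4090: ${base_build + ssd_2tb + gpu_4090:,}")  # $3,100 -- under $3,500
print(f"With a 4080: ${base_build + ssd_2tb + gpu_4080:,}")  # $2,300 -- under $2,500
```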

Had a quick peek on Newegg and it looks like a good but cheap 4080 Super is around $1k and a 2TB SSD around $100.

So about ~$2,150 for a top-end setup. Use the money saved over a 4090 to get a 32-inch 4K QD-OLED monitor and you're set for a very long time.

On a side note, it's always amusing to read people arguing over graphics, only to find out they're making these observations on some run-of-the-mill display running in a horribly set-up, out-of-the-box "game mode."
 

I wouldn't even get an OLED monitor; I would get a nice mini-LED monitor with FALD instead.
 

"Hippogryph"
I think Richard is wrong if he thinks games would get the 45% improvement without any patches. Games will get the full 45% only if they are patched to use the full 60-CU GPU, which we know from the PS4 Pro likely won't happen, given Sony's close-to-the-metal APIs. So on PS5 Pro, without any patch, we can only get improvements from the 10% CPU boost, the 28% bandwidth gain, and the small GPU clock bump we'll get, likely from 2.23 to 2.35 GHz: about 5%.

So my guess is we'll probably see around a 10% improvement in most boosted titles, like Elden Ring on PS5. I really don't think From Software will patch the game for a native PS5 Pro version, i.e. using PSSR, the new RT hardware, or 60 CUs instead of 36.
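
To show where that ~5% GPU figure comes from, and roughly how it combines with the other gains, here's the arithmetic using the clocks and percentages quoted above; the 2.35 GHz boost clock is the poster's assumption, not a confirmed spec:

```python
# Unpatched "boost mode" gains, using the figures quoted above (assumptions, not confirmed specs).
ps5_gpu_clock     = 2.23   # GHz, base PS5
ps5_pro_gpu_clock = 2.35   # GHz, assumed PS5 Pro clock for unpatched titles

gpu_clock_gain = ps5_pro_gpu_clock / ps5_gpu_clock - 1
print(f"GPU clock gain: {gpu_clock_gain:.1%}")   # ~5.4%

cpu_gain       = 0.10   # quoted 10% CPU boost
bandwidth_gain = 0.28   # quoted 28% memory bandwidth gain

# Which gain applies depends on each game's bottleneck: a GPU-compute-bound game would
# only see the ~5% clock bump, a CPU-bound game up to ~10%, a bandwidth-bound game more.
print(f"CPU-bound ceiling: {cpu_gain:.0%}, bandwidth-bound ceiling: {bandwidth_gain:.0%}")
```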
 
We will need to wait to learn more about it. I assume these games are patched to run across the whole 60 CUs if PSSR is enabled.

But as I wrote earlier, to take full advantage of the 5 Pro, you'll need a lot more than a patch. You need to fully replumb the entire game.
 
But ultra boost mode looks to work a lot like PS4 Pro boost mode, in which games only got improvements from the raw overclock gains on the GPU (~11%), the CPU (~33%), and bandwidth. We got different results depending on each game's bottleneck, and in almost all cases the result was logical and matched those bottlenecks.

AFAIK only one game (out of thousands), BF4, got improvements on PS4 Pro that were well beyond the overclocked hardware, allegedly due to how they coded the game. Supposedly they used an API layer that did not target the 18 CUs specifically but whatever CUs were available. In that game we saw a 100% improvement in framerate in some very specific physics-heavy scenes, which would imply the game, without a patch, was somehow able to use the full 36-CU GPU at least in those scenes. But all of this is guesswork.
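
For reference, those overclock percentages can be recomputed from the commonly cited clocks (PS4: 1.6 GHz CPU / 800 MHz GPU; PS4 Pro: 2.13 GHz CPU / 911 MHz GPU); treat the exact numbers as approximate:

```python
# PS4 -> PS4 Pro raw clock uplifts, from the commonly cited clocks (approximate).
ps4_cpu, ps4_pro_cpu = 1.60, 2.13   # GHz
ps4_gpu, ps4_pro_gpu = 800, 911     # MHz

print(f"CPU overclock: {ps4_pro_cpu / ps4_cpu - 1:.0%}")  # ~33%, matching the figure above
print(f"GPU overclock: {ps4_pro_gpu / ps4_gpu - 1:.0%}")  # ~14%, same ballpark as the ~11% above
```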
 
Game choice matters less than demonstrating more marked improvements.
Maybe. I think showing a PS5 exclusive and its improvements matters more than showing a PS4 game with a marked improvement.

Or if you don't care all that much about RT and are happy with FSR2, you can pick up a 7900 XTX and have 4080 performance for a lot less.
I've found Lossless Scaling's LS1 to have similar performance with fewer of the annoying artifacts that FSR has. There are still artifacts, but I don't find them as glaring.
 
While I am not an authority on the topic, I'm about 100% sure you cannot explicitly tell the CP (command processor) how many CUs you want your work to run across. Developers have no control over this, though drivers likely would. So if a game is patched to support PSSR, it's likely using everything available. It's very unlikely that it's just running on 36 CUs while 24 sit idle; that would be a terrible way to present your new hardware. Also, the way this was done in the past was to actually shut off an entire shader engine.

I can't see them doing that for the 5 Pro. Killing a shader engine shuts down a lot more than just CUs.
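
As a toy illustration of that point (a sketch only, not how the real command processor or driver behaves): a dispatch describes an amount of work, not which CUs run it, so the same workload simply spreads across however many CUs the scheduler exposes:

```python
# Toy model: the same dispatch fills whatever CUs are exposed; nothing in the
# workload itself names specific CUs. Purely illustrative, not real scheduling.
def toy_dispatch(num_wavefronts: int, available_cus: int) -> dict:
    """Round-robin wavefronts over the available CUs; returns wavefronts per CU."""
    per_cu = {cu: 0 for cu in range(available_cus)}
    for wf in range(num_wavefronts):
        per_cu[wf % available_cus] += 1
    return per_cu

work = 7200  # same workload either way
print(max(toy_dispatch(work, 36).values()))  # 200 wavefronts per CU when 36 CUs are exposed
print(max(toy_dispatch(work, 60).values()))  # 120 wavefronts per CU when 60 CUs are exposed
```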
 
Going to dump this ResetEra post here, since I know I'll bring it up later and otherwise no one will understand where I'm coming from.

I think Drevil posts here from time to time, or my memory is fading.

But I just wanted to point out again how game optimization is done, and why it's entirely possible that you don't see a lot of improvement here, possibly for XSX or maybe even the 5 Pro. Something to note when you'd expect more to be extracted but aren't seeing it.

Please note the part I bolded. If your lead platform has met its targets, you stop. That also implies that if you swing over to another platform and it's under budget, you just move on; they don't beef it up just because there might be more headroom.

Honestly, and I've personally had to optimize levels in shipping titles, if your tools say you're in under your memory caps or mesh budget, you kinda just.. stop. There's no sense in going much further because you move onto something else to fix :p

And also, what I meant by engine optimization - especially for something like UE5, is that different platforms will handle the same visual effect/mode differently than others. Xbox handles things different than Playstation just due to physical architecture and OS/drivers/bandwidth/etc

It's entirely understandable that the 20-45% additional headroom isn't enough across the bottlenecks to justify the investment to beef things up further.

Whereas with X1X, there was sufficient headroom across the entire GPU to crank the resolution up without penalty in all areas of the game.

Sadly, from this perspective, if developers can't economically find a way to extract more from these consoles, the only thing left to do is to spend those idle resources on things like AI upscaling.
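
A crude way to picture that workflow (hypothetical numbers and budget names, just to make the quoted point concrete): once a level is under its budgets on the lead platform, it falls off the work list, even if another platform still has headroom to spare:

```python
# Hypothetical budgets and usage figures, only to illustrate the "under budget -> stop" loop.
BUDGETS = {"memory_mb": 4500, "mesh_triangles": 2_000_000}

def needs_optimization(usage: dict) -> list:
    """Return the budget categories this level still exceeds."""
    return [key for key, cap in BUDGETS.items() if usage.get(key, 0) > cap]

level_usage = {"memory_mb": 4300, "mesh_triangles": 1_850_000}

over = needs_optimization(level_usage)
if over:
    print("Keep optimizing:", over)
else:
    print("Under budget -- move on.")  # nobody goes back to spend the spare headroom
```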
 
I just explained that this is how all games, bar one in one scene, behaved under PS4 Pro boost. They were not benefiting from the other 18 CUs available. This is how Sony's APIs worked in reality with boosted clocks and improved CUs. We can only assume it will work the same way for PS5 Pro. If they patch for PSSR, maybe they would also be able to patch in use of the other 24 CUs, but that's not boost mode without patching.

Without a patch we can only suppose it will work like the PS4 Pro. What you want is an unrealistic expectation based on nothing.
 
The 4 Pro was a butterfly design: effectively two PS4 GPUs on the same chip. When they needed BC, they shut off one entire shader engine. If a game was patched, it could use the entire chip.
 
I'd love to know the average resolution of the games DF has tested on both PS5 and Series X, in both performance and quality modes.

I wonder if they've ever tried to figure it out.

It would be fun if we all took guesses and could see who's the closest.
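
If anyone actually wanted to run that comparison, one reasonable definition is an average over pixel counts rather than over a single axis, since dynamic-resolution games rarely sit at one output resolution. The sample values below are made up purely to show the method:

```python
# Average measured resolutions by pixel count (made-up sample values, for illustration only).
samples = [(3840, 2160), (3200, 1800), (2560, 1440)]  # hypothetical per-game measurements

avg_pixels = sum(w * h for w, h in samples) / len(samples)
# Express the average as an equivalent 16:9 resolution.
height = (avg_pixels * 9 / 16) ** 0.5
width = height * 16 / 9
print(f"Average: ~{width:.0f}x{height:.0f}")
```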
 
 
If that PS4 game looked markedly better than current PS5 games, that would be a better sell than a marginally improved PS5 exclusive, IMO. I don't feel they came close to offering gamers a compelling reason to shell out at least $700 for an upgrade.
 