Digital Foundry Article Technical Discussion [2024]

It is no doubt a sad showing, but the load can vary wildly depending on the scene - both the area and the time of day (night can deliver a 50%+ boost to framerate). Even in an area without heavy use of god rays - which on Ultra can really hurt Nvidia in particular (there's a mod which fixes this) - I can easily drop below 30fps on my 3060 at 4K/Ultra (no HBAO or PhysX); hell, even 4K/High will teeter close to the 30fps edge - so I'd have to see where that measurement of the 290X was taken. The PS5/Series X are around a 3060 Ti in pure raster performance, so they would still need to employ cut-back settings or heavy dynamic resolution to maintain 4K/60, even with more attention given to this port.


edit: Just putting shadow distance and god rays to Medium, leaving everything else at Ultra, more than doubles my framerate to a solid 60 at 4K - yowza.

Looking at this video, the 1080 Ti runs at a locked 60fps around 90% of the time.

So I still don't think there's any reason why these consoles don't just offer a single mode that's native 4k/60fps.

Maybe it would require the odd settings tweak, but there should be no need for dynamic resolution or a separate 30fps mode.
 
Looking at this video, the 1080 Ti runs at a locked 60fps around 90% of the time.

Yes, those are mostly indoors and the opening area, which are less taxing than further out into the wasteland. On Ultra, my framerate can halve - or worse - depending on which direction I'm looking when I'm in the forest.

Bear in mind this may be a PC-specific performance regression with Ultra god rays. For example:

4K Ultra, 3060:

[screenshot]

Turn around, walk two minutes in another direction:

[screenshot]

...then leave everything else on Ultra but drop just god rays to Medium:

[screenshot]

It's possible Ampere, or the 3060 in particular, suffers from the god rays setting in a way the 1080 Ti doesn't, or a later Nvidia driver fucked up this setting too. So there might have to be some settings tweaks for 4K/60 on console, but yeah, you're probably right - pretty minor.

Especially considering there's a mod that gives you High/Ultra god rays quality with the performance of Low/Medium.
 
Are you referring to shader compilation stutter or poor quality in general? I think DF makes a bit of a meal out of the whole stutter issue but I think it’s important that somebody does. It also serves as a useful lightning rod to draw attention to the poor state of game releases in general.

That was the main line of discussion but I feel the sentiment applies to overall performance issues in games and things like day 1 vs. patched improvements.

I do agree that awareness of the issue is important, and we're already seeing some steps towards tackling the issues involved.

But at the same time, I just don't feel the extreme reaction at the other end of the spectrum benefits most of the gaming audience either. Especially the idea that's been put forward that developers should push less hard and be more conservative if it means a more streamlined experience out of the box.
 
For the Fallout 4 discussion, I think it's worth keeping in mind that performance in that game, based on PC testing, is known to be very dependent on the CPU and memory side.

Zen 2 CPUs (faster than those in the consoles) with lower-latency memory, for example, could not actually peg the game at 60fps in the famous Corvega section and its equivalents. It wasn't until Zen 3 CPUs on the AMD side that you got true 60fps CPU performance in Fallout 4.
 
Contrary to what they claim, Weapon Debris isn't broken. It works on GTX 10-series and earlier; it's an issue only on Turing and newer (RTX and GTX 16-series).

So that just means it's broken for millions of other cards released in the past 6 years.

A bug that manifests on some hardware and not others is still a bug, and releasing an update for an old game is the perfect opportunity to squash it. Maybe it's more on Nvidia's side, who knows - but it's up to Bethesda to coordinate with Nvidia and fix it, or modify the effect.

The problem as well is that this isn't just a case of a certain effect not being available; the way this bug manifests is random crashes/hard locks - I've already seen several posts on forums where people say "Oh god, I had no idea that was the cause!". Troubleshooting for the end user is made more complicated by the massive mod community: this bug can lead new users down a rabbit hole of wasted time, as they might assume the random crashes are due to a mod they've installed. People with modern hardware playing such an old game will crank all the sliders up to 11, and their first impression is going to be of a broken game.

The effect is hardly critical to the game, sure, but at the very least, not even bothering to grey out the option for a wide range of hardware you know can't run it without seriously compromising stability is incompetence.
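Just to illustrate how cheap the "grey it out" mitigation would be: a hypothetical launcher-side check, sketched below, that disables the toggle based on the detected GPU. The vendor-ID and name-matching heuristic is my own assumption for brevity - a real implementation would map device IDs to architectures, or better, fix the effect itself.

```cpp
// Illustrative sketch only: hide/disable the Weapon Debris toggle on GPUs
// where the effect is reported to hard-lock. Not how Bethesda's launcher or
// Nvidia Flex actually detects hardware.
#include <windows.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <string>

using Microsoft::WRL::ComPtr;

bool WeaponDebrisLikelySafe()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return false; // can't tell: be conservative and hide the option

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter)))
        return false;

    DXGI_ADAPTER_DESC1 desc = {};
    adapter->GetDesc1(&desc);

    // 0x10DE is Nvidia's PCI vendor ID; the reported crashes are limited to
    // Turing and newer (RTX / GTX 16-series), so other vendors pass through.
    if (desc.VendorId != 0x10DE)
        return true;

    // Stand-in heuristic: match on the marketing name. A real check would
    // translate DeviceId into a GPU architecture instead.
    std::wstring name(desc.Description);
    return name.find(L"RTX") == std::wstring::npos &&
           name.find(L"GTX 16") == std::wstring::npos;
}
```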
 
70W of power consumption on battery, including the OLED display - more efficient than an Xbox Series S.

 
For the Fallout 4 discussion, I think it's worth keeping in mind that performance in that game, based on PC testing, is known to be very dependent on the CPU and memory side.

Zen 2 CPUs (faster than those in the consoles) with lower-latency memory, for example, could not actually peg the game at 60fps in the famous Corvega section and its equivalents. It wasn't until Zen 3 CPUs on the AMD side that you got true 60fps CPU performance in Fallout 4.
While true, I feel a bit more effort could have gotten things running better on these 'to the metal' next-gen consoles than on any equivalent PC hardware. Especially since a lot of the CPU bottlenecks on PC were very memory-performance-dependent, something that can be better addressed on console.
 
Especially the idea that's been put forward that developers should push less hard and be more conservative if it means a more streamlined experience out of the box.

Reducing ambition is just one option, and I didn't see DF suggesting that games should do less. They're demanding better QA and higher standards for release quality. A very reasonable ask, I think.

I wouldn’t characterize the frustration as wanting a “more streamlined experience”. It’s more of a “not broken experience”.
 
0:01:41 News 01: Bethesda announces big Starfield update
0:23:05 News 02: AMD sees massive gaming revenue decline
0:40:14 News 03: New PS5 Pro GPU details!
0:58:27 News 04: Switch 2 rumour roundup
1:07:40 News 05: RTX Remix getting DLSS 3.5 Ray Reconstruction
1:16:02 News 06: Resident Evil titles get path tracing!
1:27:14 News 07: AMD Strix APUs pack great integrated GPU performance
1:35:06 Supporter Q1: Could Nintendo games use DLSS 2 to reach 4K output on Switch 2?
1:40:23 Supporter Q2: What features would you add to Switch 2, if you could pick anything?
1:45:25 Supporter Q3: Could the potential merging of Xbox and PC development hurt Xbox?
1:50:13 Supporter Q4: Could developers run a game’s logic at high rates to improve responsiveness, while keeping frame-rate untouched? (see the sketch below the list)
1:54:19 Supporter Q5: Gaming handhelds theoretically seem as fast as a Series S. Why are they slower in practice?
1:57:40 Supporter Q6: Could Valve build a viable console platform to compete with Sony?
2:03:16 Supporter Q7: If you had to pick between #StutterStruggle and FSR 2 artifacts, which would you pick?
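Re: Supporter Q4 - the usual answer is a decoupled fixed-timestep loop, where simulation ticks at a high fixed rate and rendering happens once per displayed frame. A minimal sketch of that idea below (not from the podcast; the 120Hz tick rate and stub functions are assumptions for illustration):

```cpp
// Hypothetical sketch of a decoupled fixed-timestep loop: logic runs at a
// fixed 120Hz so input/physics latency stays low, while rendering happens
// once per displayed frame at whatever rate the GPU manages (30, 60, ...).
#include <chrono>

using Clock = std::chrono::steady_clock;

// Stub hooks - a real engine would plug its own systems in here.
static void UpdateSimulation(double /*dtSeconds*/) { /* input, physics, AI */ }
static void RenderFrame(double /*alpha*/)          { /* submit draw calls   */ }
static bool PumpOSMessages()                       { return true; /* quit handling */ }

void RunLoop()
{
    constexpr double kLogicHz   = 120.0;           // assumed logic tick rate
    constexpr double kLogicStep = 1.0 / kLogicHz;  // fixed timestep in seconds

    auto   previous    = Clock::now();
    double accumulator = 0.0;

    while (PumpOSMessages())
    {
        const auto now = Clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Consume elapsed real time in fixed logic-sized steps, so the
        // simulation advances at 120Hz regardless of the frame-rate.
        while (accumulator >= kLogicStep)
        {
            UpdateSimulation(kLogicStep);
            accumulator -= kLogicStep;
        }

        // Draw once per frame, interpolating between the last two simulation
        // states (alpha in [0,1)) so motion stays smooth at lower frame-rates.
        RenderFrame(accumulator / kLogicStep);
    }
}

int main() { RunLoop(); }
```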
 
0:01:41 News 01: Bethesda announces big Starfield update
0:23:05 News 02: AMD sees massive gaming revenue decline
0:40:14 News 03: New PS5 Pro GPU details!
0:58:27 News 04: Switch 2 rumour roundup
1:07:40 News 05: RTX Remix getting DLSS 3.5 Ray Reconstruction
1:16:02 News 06: Resident Evil titles get path tracing!
1:27:14 News 07: AMD Strix APUs pack great integrated GPU performance
1:35:06 Supporter Q1: Could Nintendo games use DLSS 2 to reach 4K output on Switch 2?
1:40:23 Supporter Q2: What features would you add to Switch 2, if you could pick anything?
1:45:25 Supporter Q3: Could the potential merging of Xbox and PC development hurt Xbox?
1:50:13 Supporter Q4: Could developers run a game’s logic at high rates to improve responsiveness, while keeping frame-rate untouched?
1:54:19 Supporter Q5: Gaming handhelds theoretically seem as fast as a Series S. Why are they slower in practice?
1:57:40 Supporter Q6: Could Valve build a viable console platform to compete with Sony?
2:03:16 Supporter Q7: If you had to pick between #StutterStruggle and FSR 2 artifacts, which would you pick?
Remember people going on and on about hardware VRS and mesh shaders? Now that they're being added with the PS5 Pro, perhaps we'll get better use of them?
 
Remember people going on and on about hardware VRS and mesh shaders? Now that they're being added with the PS5 Pro, perhaps we'll get better use of them?
Not necessarily.

Using hardware-based features doesn't necessarily mean it's better than the software variant. But for studios using their own engines, which can't roll what UE5 is doing, having the option of a 3D-pipeline variant is better than none at all. It helps in that they don't need to build everything from scratch, and makes it easier to adopt if they need it.
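To make the "hardware feature vs software variant" point concrete: on PC at least, an engine typically probes the device's feature tiers once at startup and picks a path per feature. A minimal D3D12 sketch under that assumption (the CheckFeatureSupport calls are the real D3D12 API; the surrounding struct and function names are mine, and consoles expose the equivalent capabilities through their own APIs):

```cpp
// Probe DX12 Ultimate feature tiers and decide hardware vs software paths.
#include <windows.h>
#include <d3d12.h>

struct GpuFeaturePath
{
    bool useHardwareMeshShaders = false; // else: compute/vertex-shader geometry path
    bool useHardwareVrsTier2    = false; // else: software variable-rate shading
};

GpuFeaturePath QueryFeaturePath(ID3D12Device* device)
{
    GpuFeaturePath path;

    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7))))
    {
        path.useHardwareMeshShaders =
            opts7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS6, &opts6, sizeof(opts6))))
    {
        // Tier 2 adds per-primitive and screen-space-image VRS; Tier 1 is
        // per-draw only, which many engines don't bother using.
        path.useHardwareVrsTier2 =
            opts6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2;
    }

    return path;
}
```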
 
Can we necrobump the bandwidth-per-CU discussions? Or the "wide is bad and narrow is good" one?

DF reports 16 WGPs per shader engine, well beyond the XSX's 14, with marginally more bandwidth on a smaller bus (256-bit vs 320-bit) and 4MB of L2 vs 5MB of L2 on XSX.

Hopefully this puts an end to that silly metric.

Are we vindicated in saying that the PS5 was not RDNA 2? We can see now that they take full VRS and mesh shaders. I assume it's fully DX12U-compliant and then some now.
 
Can we necrobump the bandwidth-per-CU discussions? Or the "wide is bad and narrow is good" one?

DF reports 16 WGPs per shader engine, well beyond the XSX's 14, with marginally more bandwidth on a smaller bus (256-bit vs 320-bit) and 4MB of L2 vs 5MB of L2 on XSX.

Hopefully this puts an end to that silly metric.

Are we vindicated in saying that the PS5 was not RDNA 2? We can see now that they take full VRS and mesh shaders. I assume it's fully DX12U-compliant and then some now.
Also, can we start assuming that the performance gap between the PS5 Pro and the PS5 will grow further into the generation, since only new games can utilize the wide architecture? 😅
 
Also, can we start assuming that the performance gap between the PS5 Pro and the PS5 will grow further into the generation, since only new games can utilize the wide architecture? 😅
lol. That’s a good question.

I think the SX has always been handicapped by the lack of optimization or design for its hardware. I don't know what this means for Xbox, but with the PS5 Pro having a similar configuration, maybe it's worth doing that level of optimization for it?

But I honestly don't think it's wide-architecture optimization. Everything at GDC showcases how GNM and DX12 are very different APIs, and how they resolve issues differs even though the hardware is the same.

The UE5 team has a great example on compacting NOP submissions, and it comes across very differently on Xbox and PS5. And if you decide to "port" PS5 code to DX12, it will work but may not actually be performant.

When I think about optimization, that’s what I think is happening.
 