Digital Foundry Article Technical Discussion [2024]


Up next: Series S and Steam Deck.

Very respectable version of the game on Series S, though at a lower resolution and with some settings reduced. The loss of Lumen RT reflections is a pity IMO, as at times it compromises the visual consistency beyond just things looking softer and noisier.

On Steam Deck, DP4a XeSS is once again better than FSR for a similar performance cost. Playable on Steam Deck, but it needs to be cut back to the point where it looks pretty rough.

Predictably, there's a point below which UE5 doesn't scale so well in terms of final output image.
The ROG Ally has 8.3 teraflops. I had forgotten, and this video by Olie reminds me how crazy that is tbh.
 
Is it weird that I vastly prefer the Hellblade 2 Series X visuals over whatever the maxed-out PC looks like?

The animations, especially the running, look superior on Series X, as the motion blur smooths out the animation and gives it a more natural look. I prefer 4K 60 fps in other titles, but on Series X it just looks more consistent IMO.

Does anyone understand?
 
You are talking about the soap opera effect. Sometimes I can see it in cutscenes, but during gameplay I have never felt it.
 
The running animation at 60 fps looks fake because you can see the run cycle repeating itself, whereas at 30 fps the blur hides it and your mind fills in the gaps. Also, at 4K with the highest settings you can see that the lava splashes, for example, are basically 'texture planes'; they don't even interact with the lava when they fall back down. At the Xbox Series X settings they're less 'separated' from the lava, so you assume they merge when they splash down into the pool because you don't see it as clearly.

There are some other examples as well, but it's not simply "too smooth", as I prefer 60 fps in most titles. It's more that the animation is probably keyframed at 30 or 24 and interpolated from there.
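
To illustrate what I mean by interpolation (just a toy sketch of my own, not anything taken from the actual game): if a run cycle is authored at 30 Hz and simply lerped up to 60 fps, every other displayed pose is a blend of two authored keys, so the cadence of the cycle stays locked to the original 30 Hz timing and becomes easier to pick out.

```python
# Hypothetical sketch: sampling a 30 Hz keyframed value at 60 fps
# via linear interpolation. Names and numbers are illustrative only.

def lerp(a, b, t):
    """Linear blend between two keyframe values."""
    return a + (b - a) * t

def sample(keys, key_rate_hz, t_seconds):
    """Sample a looping keyframe track (authored at key_rate_hz) at time t."""
    pos = t_seconds * key_rate_hz
    i = int(pos)
    frac = pos - i
    a = keys[i % len(keys)]
    b = keys[(i + 1) % len(keys)]
    return lerp(a, b, frac)

# A toy 8-key "run cycle" authored at 30 Hz (think of it as a vertical hip offset).
run_cycle = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]

# Displayed at 60 fps: every second sample is an in-between, not a new authored pose.
for frame in range(8):
    t = frame / 60.0
    print(f"60 fps frame {frame}: value = {sample(run_cycle, 30, t):+.2f}")
```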
 
You can lock the PC version to 30 fps or 24 fps if you care. Also, if you find the fidelity too high due to the resolution, you can adjust the resolution down or downsample it to Blu-ray quality.

There is nothing wrong with the PC version's visual make-up as far as I know.
 
I don't think he meant that the Xbox version is superior, just that he likes the 30 fps on the console more than a PC running at 60.
 
Is it weird that I vastly prefer the Hellblade 2 Series X visuals over whatever the maxed-out PC looks like?

The animations, especially the running, look superior on Series X, as the motion blur smooths out the animation and gives it a more natural look. I prefer 4K 60 fps in other titles, but on Series X it just looks more consistent IMO.

Does anyone understand?
A poster with the username XboxKing doesn’t sound very impartial.
 
The ROG Ally has 8.3 teraflops. I had forgotten, and this video by Olie reminds me how crazy that is tbh.

Dual-issue very rarely gets used, and the Ally only clocks at around 2 GHz, so it's more like a 3 TF RDNA 2 part.

Yeah, I mean the APU in the ROG Ally is really cool, but as Subtlesnake says, 8 TF of RDNA 3 performs very much like a 4 TF RDNA 2 part.
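
For anyone who wants to sanity-check those figures, here's the usual back-of-the-envelope FP32 arithmetic (a rough sketch of my own; the clock values are assumptions, not measurements). The headline 8.3 TF number counts RDNA 3's dual-issue, so dropping dual-issue and using a realistic sustained handheld clock lands you in the 3-4 TF range people are quoting:

```python
# Back-of-the-envelope FP32 TFLOPS: shader count * ops per clock * clock.
# Clock values below are assumptions for illustration, not measurements.

def tflops(cus, clock_ghz, dual_issue=False):
    shaders = cus * 64                              # 64 stream processors per CU
    ops_per_clock = 2 * (2 if dual_issue else 1)    # FMA = 2 ops, optionally dual-issued
    return shaders * ops_per_clock * clock_ghz / 1000.0

# Z1 Extreme (ROG Ally): 12 CUs, ~2.7 GHz boost, RDNA 3 dual-issue -> headline figure
print(f"Z1E headline (dual-issue, 2.7 GHz):  {tflops(12, 2.7, dual_issue=True):.1f} TF")

# Same chip without dual-issue at a ~2.0 GHz sustained handheld clock
print(f"Z1E sustained (single-issue, 2 GHz): {tflops(12, 2.0):.1f} TF")

# Xbox Series S for comparison: 20 CUs at a fixed 1.565 GHz, RDNA 2
print(f"Series S:                            {tflops(20, 1.565):.1f} TF")
```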

From looking at YouTube videos, it seems like the ROG Ally has GPU clocks that vary wildly between games, going as high as 2.5 GHz and dropping as low as 800 MHz. In some lighter games you can typically sit above 2 GHz, in others you're typically well below 1.5 GHz, and in some games you're struggling to maintain 1 GHz (though the CPU load for streaming and AMD SmartShift might be taking more power from the GPU there).

This is particularly interesting to me from the perspective of a handheld Xbox that's backwards compatible with the Series S. A 30 W APU like the Z1 Extreme might not be too far behind the S when it can turbo through the roof and bandwidth isn't an issue, but in demanding, BW-heavy games it's pretty clear the Series S would crush it.

A wider GPU (20 CUs compared to 12) wouldn't need to clock nearly as high as the ROG Ally's and would therefore be more power efficient, but we've yet to see such a wide mobile APU. You'd also need to do something to massively boost memory bandwidth, because contrary to much of the opinion on the internet, the Series S actually has a lot of bandwidth to feed its GPU compared to other APUs (bar the Series X), and vastly more than any PC or mobile APU.
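
To put rough numbers on the bandwidth point (my own back-of-the-envelope sketch; the handheld figure assumes LPDDR5-6400 on a 128-bit bus like the current Ally, and the Series S figure is its fast 8 GB pool only):

```python
# Peak memory bandwidth in GB/s = per-pin data rate (Gbps) * bus width (bits) / 8.
# Figures are public-spec approximations, used only for a rough comparison.

def peak_gb_per_s(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

# Xbox Series S: 14 Gbps GDDR6 on a 128-bit bus (the fast 8 GB pool)
print(f"Series S (fast pool): {peak_gb_per_s(14, 128):.0f} GB/s")

# Typical handheld APU: LPDDR5-6400 on a 128-bit bus (e.g. ROG Ally)
print(f"LPDDR5-6400 handheld: {peak_gb_per_s(6.4, 128):.1f} GB/s")
```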
 
Kudos to @Dictator for pointing out the heavy PCIe bandwidth utilization and the frame rate/frame time drops that come with it, which @yamaci17 also pointed out in the Ratchet and Clank thread. I've noticed the same in that game after reaching Torren IV and it's a pretty bad user experience. Considering that Avatar handles this much better and that Ghost of Tsushima is just a PS4 port, this seems to be an issue with how Nixxes handles memory usage in these ports in general.

That is definitely a crucial Achilles heel in these Nixxes ports and I hope Alex will forward his findings to Nixxes.

Thanks for making PC gaming better, Alex; the suggestions you've made in your video would already help a lot.
 
Yeah, I definitely noticed this moving my 2080 Ti from my PCIe 3.0 motherboard to my 4.0 one. The frame rates are much more stable on the 4.0 board; there are weird, random, massive performance drops with PCIe 3.0. I don't understand why, because it's x16, which should have more than enough bandwidth to handle any game twice over.
 
The frame time graph in the PCIe test makes it look more like an issue of how the PCIe bus is being used than of the total amount of data transferred each second.

You have a group of frames where the lower PCIe BW isn't hurting frame rate, alternating with a frame or two where the BW hurts a lot. I think this is probably because the total data transferred isn't spread evenly over time, but is instead sent in periodic chunks for whatever reason.

I really don't think total bandwidth over, say, a second is the issue; that should easily be enough even on PCIe 3.0. The issue is far more likely how the game is using the bus. In the screenshot above it's saturating the bus roughly every ~150 ms based on counting frames, and the frame time spikes are about as frequent on PCIe 3.0 x16 as on x8; it's just that the spikes are smaller because the bandwidth is higher.

Dumping, say, a hundred MB into GPU memory a few times a second isn't going to cause a problem on a console with hundreds of GB/s of bandwidth, but it eats up most of what PCIe 3.0 x8 can move within a single 60 fps frame even theoretically, assuming 100% bus efficiency and no other data being sent to the GPU to tell it how to draw the next frame.
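
To put numbers on that (my own rough sketch, assuming roughly 1 GB/s per PCIe 3.0 lane and a perfectly efficient bus): a single 100 MB burst occupies most of a 16.7 ms frame's theoretical transfer window on 3.0 x8, so it shows up as a frame time spike rather than just nudging the average bandwidth figure.

```python
# How long would a single large upload occupy the PCIe bus?
# Assumes ~0.985 GB/s per PCIe 3.0 lane (8 GT/s with 128b/130b encoding) and a
# perfectly efficient bus; real-world efficiency is lower, so these are best cases.

GB = 1000**3
MB = 1000**2

links = {
    "PCIe 3.0 x8":  8  * 0.985 * GB,
    "PCIe 3.0 x16": 16 * 0.985 * GB,
    "PCIe 4.0 x16": 16 * 1.969 * GB,
}

burst = 100 * MB          # a hypothetical 100 MB streaming burst
frame_ms = 1000 / 60      # 16.7 ms frame budget at 60 fps

for name, bytes_per_s in links.items():
    transfer_ms = burst / bytes_per_s * 1000
    print(f"{name}: {transfer_ms:5.1f} ms for 100 MB "
          f"({transfer_ms / frame_ms:5.0%} of a 60 fps frame)")
```

Which also matches the pattern above: the spikes land at the same cadence on both link widths; they're just smaller when the link is faster.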

@Dictator do you have any tools to measure PCIe traffic?
 

A lot of issues that other reviewers missed, kudos to Alex as always.

The camera panning stutter is very similar to what I saw with Horizon Forbidden West on my rig, albeit to a lesser degree than shown here. That absolutely has to be fixed, along with the brutal DOF/particle flickering in DLSS.

The fact that 8 GB GPUs have to run with textures similar to (or lower than?) the PS4 Pro in order not to be throttled, at a reconstructed resolution similar to CBR, is... not great. This was also seen in Horizon Zero Dawn, albeit more with 6 GB cards at the time (8 GB cards will still show the odd lower-res texture mip at high resolutions today). Combining that with heavy PCIe traffic to boot, versus a system that runs the entire game in less than 5 GB of RAM, suggests perhaps not the most optimal VRAM management.
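
For a sense of scale on the texture side (my own illustrative arithmetic, nothing specific to this port): a single 4K BC7 texture with a full mip chain is a little over 20 MB, so a few hundred unique high-res materials already eat several GB before framebuffers, reconstruction buffers and geometry enter the picture.

```python
# Rough VRAM footprint of block-compressed textures (illustrative numbers only).
# BC7 stores 8 bits per texel; a full mip chain adds roughly one third on top.

def texture_mib(width, height, bits_per_texel=8, with_mips=True):
    base = width * height * bits_per_texel / 8      # bytes for the top mip
    total = base * 4 / 3 if with_mips else base     # + ~33% for the mip chain
    return total / 1024**2

for size in (1024, 2048, 4096):
    per_tex = texture_mib(size, size)
    print(f"{size}x{size} BC7 + mips: {per_tex:6.1f} MiB "
          f"-> 300 of them: {per_tex * 300 / 1024:4.1f} GiB")
```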

That occlusion issue does indeed exist in the PS5 version, as Alex notes, but the examples given seem to show the PC suffering from it more significantly. Hopefully we see the same fixes they delivered for Spider-Man, which had this issue as well, but this may be an engine problem.

Not sure if this is a case of Nixxes being stretched too thin, of these custom engines presenting uniquely challenging porting conditions, or a combination thereof, but it's getting a little frustrating to face the requisite weeks-to-months wait for some of the more significant kinks to be worked out, even from the 'god-tier' of PC porting studios (and Nixxes are at least pretty good at delivering solid patches). Some of these artifacts are pretty glaring and should not have to be raised by a major outlet like DF to be addressed.

I was going to chart the progress of H:FW on my PC across patches and I'll check it again eventually, but to be honest I'm getting a bit tired of patch roulette. :(
 
I swear to god I don't have this issue with Horizon FW. It's perfectly smooth when I rotate the camera.

I haven't bought Ghost of Tsushima yet since I've got so many other things I want to finish playing first, but I have no reason to expect it won't be the same. Maybe it's a 60 Hz display issue?
 