Digital Foundry Article Technical Discussion [2024]



00:00 Intro/Ark Survival Ascended Upgrades
03:28 PS5 vs Xbox Series X/S Visual Comparisons
06:59 PC's Epic Settings Tested
08:13 Command Console Explained
10:04 Xbox Series S Frame-Rate Test
14:07 Xbox Series X Frame-Rate Test
16:19 PS5 Frame-Rate Test
19:08 Verdict
this is a very sad and pathetic console generation. Mediocrity galore. Pitiful. I can't find the words. At least Tom had fun tweaking the settings on the Series S.
 
Yup, likely lower-end textures. A Kraken (better lossless compression) or memory footprint advantage seems likely.

I wonder if trying to keep certain assets within the GPU optimal 10GB of memory might lead to a big texture like the ground texture being reduced in size.
 
I'm leaning towards texture compression differentials only because I'm expecting the UE5 virtual texturing system to be more than sufficient here in keeping texture pools small.
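To give a rough sense of why one big ground texture matters for a memory budget, here's a back-of-the-envelope sketch. The function name is mine, and the 8-bits-per-texel figure (BC7-style compression) is an illustrative assumption, not anything taken from Ark or UE5:

```python
# Hypothetical back-of-the-envelope sketch of a texture's VRAM footprint.
# Figures are illustrative assumptions, not taken from Ark or UE5.

def texture_vram_bytes(width: int, height: int, bits_per_texel: int,
                       with_mips: bool = True) -> int:
    """Approximate VRAM footprint of one texture."""
    base = width * height * bits_per_texel // 8
    # A full mip chain adds roughly one third on top of the base level.
    return base * 4 // 3 if with_mips else base

# An 8K ground texture in a BC7-style format (8 bits per texel) comes
# to roughly 85 MiB with mips, which is why halving its resolution
# frees up a meaningful chunk of a 10 GB "GPU-optimal" pool.
ground_mib = texture_vram_bytes(8192, 8192, 8) / (1024 ** 2)
```

Virtual texturing is precisely what should make this kind of whole-texture accounting unnecessary, which is why the compression-differential explanation feels more plausible.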
 
Nice, glad to see this get some coverage! It's definitely rough on console. IMO, rather than chasing 60, it would be a much better experience to ship a properly paced vsync/2 30fps cap by default. Yes, it can still fall below that occasionally, but it would make the majority of the experience a lot better. It's a shame they decided to use volumetric clouds *as* fog, since the performance cost is just ridiculous, but turning them off loses a lot of visual appeal.
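For what "properly paced vsync/2" means in numbers, a minimal sketch (the function name is mine, not an engine API):

```python
# Minimal sketch of half-refresh-rate frame pacing; names are illustrative.

def paced_frame_time_ms(refresh_hz: float, vsync_divisor: int) -> float:
    """Frame interval when presenting every Nth vblank."""
    return 1000.0 / refresh_hz * vsync_divisor

# On a 60 Hz display, vsync/2 presents every other vblank:
# a consistent ~33.33 ms per frame, i.e. an evenly paced 30fps,
# instead of an uncapped rate bouncing between vblank intervals.
interval = paced_frame_time_ms(60, 2)
```

The point of locking to every second vblank is that each frame persists for exactly two refresh cycles, so frame delivery stays even rather than juddering between 16.7 ms and 33.3 ms presents.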

Having played many dozens of hours at this point, though, I'm sticking with my take that the game looks amazing, but it is definitely super heavy and doesn't scale down as well as one would like. You basically need a 4080-class card or better to enjoy it anywhere near max settings, although of course "max settings" are always kind of an arbitrary thing. I don't think the console versions necessarily strike a great balance out of the box, although having a smoother 30fps-capped experience from the start would go a long way, IMO. The compromises for 60fps are just too much.

I also have no idea why the textures were slashed so much. Maybe on Series S there's always a memory argument, but in general virtual texturing works very well...
On my 6 GB VRAM GPU, the textures also look like that; they never load, regardless of setting or resolution.

By the way, something appears to be wrong with UE5's volumetric cloud system. I've had buggy, flickery, artifacting clouds in Hellblade 2, and I have them again in Ark. They flicker and artifact like crazy, and it's very unpleasant to the eye. It's not clear which upscaler Ark uses, since there's no setting for it, but it might be an interaction between upscaling and UE5's volumetric cloud system. This is at medium settings, 1440p, with a 50% render resolution using the "innate" upscaler (probably a loose translation from German; it may refer to TSR). I really don't think the clouds should look like that.
[Attachment: flicker.png]

And now imagine the most horrible flickering you can ever imagine.

Here's the same issue in Hellblade 2 using DLSS at the same render resolution. (Low-Medium settings)

[Attachment: Pixels.png]

I don't have time for a 20 GB Fortnite download right now, but if memory serves, I've noticed buggy, artifacting clouds there as well (take this with a grain of salt; I might verify it at a later date). But seeing the exact same issue in two recent UE5 games already is a clue that points toward the problem being engine-related.
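For scale, those render-resolution figures work out as follows. A quick sketch, assuming per-axis percentage scaling (as with UE's screen percentage); the function name is mine:

```python
# Sketch of per-axis render-scale math (as with UE's screen percentage).
# Function name is illustrative, not an engine API.

def internal_resolution(out_w: int, out_h: int, scale_pct: float):
    """Internal render size for a given per-axis scale percentage."""
    return int(out_w * scale_pct / 100), int(out_h * scale_pct / 100)

# 50% of 2560x1440 renders internally at 1280x720 -- only a quarter
# of the output pixel count, which is brutal for noisy, stochastic
# effects like volumetric clouds.
w, h = internal_resolution(2560, 1440, 50)
```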
 
I played some Fortnite yesterday on PS5 and I can confirm some amount of artifacting there as well.
 
It's not clear which upscaler Ark uses, since there's no setting for it, but it might be an interaction between upscaling and UE5's volumetric cloud system. This is at medium settings, 1440p, with a 50% render resolution using the "innate" upscaler (probably a loose translation from German; it may refer to TSR). I really don't think the clouds should look like that.
There are definitely upscaler settings on PC; they're just spread between two tabs in the settings menu, IIRC. On console I assume it's TSR.

[Attachment: 1718379522595.png]

Regarding the artifacts, it's probably just sampling resolutions falling low enough that the reconstruction starts to have a lot more trouble. If you leave it on medium but bump the internal res back to native (DLAA or 100% TSR scaling), do the artifacts reduce enough to look more similar to High/Epic?

Unfortunately, with a number of these stochastic effects there is a critical breaking point beyond which there just aren't enough samples to reconstruct reasonably, and it's exacerbated by motion. SMRT/VSMs run into similar issues if the settings are pushed too low and/or the contrast is too extreme.
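That breaking point can be made concrete with a quick calculation (illustrative; assumes one fresh sample per internal pixel per frame):

```python
# Sketch: fresh samples available per *output* pixel each frame under
# per-axis upscaling. Assumes one sample per internal pixel per frame.

def samples_per_output_pixel(scale_pct: float,
                             samples_per_internal_pixel: float = 1.0) -> float:
    s = scale_pct / 100.0
    return s * s * samples_per_internal_pixel

# At 50% per-axis scale, each output pixel gets only 0.25 fresh samples
# per frame, so reconstruction leans heavily on temporal history -- and
# history is exactly what breaks down in motion, hence the flicker.
fresh = samples_per_output_pixel(50)
```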
 
Ha, this is a pretty great practical graph of the bandwidth differences in the VRAM/PCIe test :D Could not have planned those squares/volumes out more perfectly.

[Attachment: 1718380866635.png]

The VRAM meters in these games are, I think, still a net positive, but they should probably come with a lot of asterisks about how approximate they are. Even if the game could somehow be perfectly accurate about its own use over the full experience (which is impossible for a variety of reasons, one being that both the OS and the driver get votes that change over time), the app can't predict other uses of VRAM or the OS's decisions about VRAM sharing between applications, especially on multi-monitor systems. A healthy buffer is probably always going to be needed on PC, similar to dynamic resolution.
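That "healthy buffer" amounts to something like the following sketch. The reserve and headroom figures here are illustrative guesses, not measured values:

```python
# Sketch of a conservative app-side VRAM budget. The reserve and
# headroom figures are illustrative assumptions, not measured values.

def app_vram_budget_mb(total_mb: int,
                       os_driver_reserve_mb: int = 800,
                       headroom_pct: int = 10) -> int:
    """Budget = (total - OS/driver reserve) minus a safety margin."""
    usable = total_mb - os_driver_reserve_mb
    return usable * (100 - headroom_pct) // 100

# A 12 GB (12288 MB) card would budget about 10339 MB under these
# assumptions, leaving slack for the OS, the driver, other apps,
# and multi-monitor desktop composition.
budget = app_vram_budget_mb(12288)
```

The point isn't the specific numbers but the structure: the app can only ever budget against an estimate of what's left after everyone else has taken their cut, and that cut moves over time.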
 
@Dictator: You guys looked at this?


Independent frame interpolation utility on Steam. Cyan's the only user reporting but it sounds really good.
 
I actually was gonna look at it soon
 
0:02:29 News 01: Star Wars Outlaws demo reaction!
0:28:42 News 02: Assassin’s Creed Shadows unveiled
0:46:55 News 03: Clarifying the Gears of War cinematic trailer
0:55:05 News 04: Epic Games Store leaks upcoming games
1:01:05 News 05: Riven demo released
1:08:59 Supporter Q1: Has any console generation been more boring than this one?
1:13:43 Supporter Q2: Was Microsoft showing Series X or PC footage at their games showcase?
1:20:20 Supporter Q3: Is delaying a physical release the best option for modern games?
1:27:06 Supporter Q4: Will you make an updated PC Gaming on a Budget video?
1:34:32 Supporter Q5: Do you have ambitions for growing your audience?

 
About the Gears of War: E-Day trailer discussion... I'm going to come out and say it: outside of image quality and visual stability, I bet that's what the game looks like. I don't think the destruction animations or anything else are outside the realm of real-time rendering, considering how they handled big destruction sequences like that in Gears 5. They're showing this trailer while talking about 100x the character and environment detail and vastly improved animations for a reason. UE5.4 gives them so much more performance potential. Given what we know they were capable of with UE4, I don't think it's unreasonable to expect this quality. This game is going to blow people away for sure.
 
I think the visuals will be quite a bit worse, particularly during gameplay. I suspect this trailer will compare to the final game about as well as the BF 2042 in-engine reveal trailer did, which the devs also claimed used in-game assets.
 