Digital Foundry Article Technical Discussion [2024]

That complaint doesn't seem relevant here though. The CSMs are suffering from more artifacting and pop-in than the RT stuff, so he'll need to pick his poison: static, clean shadows or noisy, animated ones.
So it turns out you can just enable VSMs via cvars (in the INI files), and they seem to work fine, at least in the first 15 minutes or so of the game I tested. They also seem to have very little if any impact on performance, at least as far as I'm able to eyeball it (the game doesn't seem to have a way to checkpoint a specific view or sequence). Additionally, since they have apparently set up reasonable light source radii/angles (presumably for the raytracing path), you get reasonable soft shadows as well.
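For anyone wanting to try this, the VSM toggle is a stock UE5 cvar; a sketch of what the Engine.ini override might look like (exact cvar names can vary between engine versions, so verify against your build):

```ini
; Engine.ini — cvars under [SystemSettings] are applied at startup
[SystemSettings]
; Switch the shadowing method to Virtual Shadow Maps
r.Shadow.Virtual.Enable=1
```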

Maybe they break later in the game or something; otherwise I'm not really sure why they wouldn't at least expose them as an option since the quality improvement is pretty significant.

Screenshots
 
Fantastic! Hopefully there's nothing game breaking. Low rate animation and shadow LOD pop are the two visual things that for some reason just dig under my skin. I'm sure I'll get this game eventually, and it'd be nice to know there's a high quality shadow option.
 
You still get a bit of pop from foliage since it appears to be non-nanite and thus will still pop LODs (in both primary view and shadows), but it's certainly much better than the CSM path which has both that, and excessive blur and flicker due to low resolution. Performance aside, RT shadows are probably still the best option here as they likely just... don't do any LOD on the foliage, although you could force something similar for VSMs via other cvars if you wanted as well.
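If you wanted to experiment with pushing those foliage LOD transitions further out, the stock UE cvar for that would be along these lines (the scale value here is just an illustrative guess, not a tested recommendation):

```ini
[SystemSettings]
; Scale foliage LOD transition distances; >1 keeps higher-detail LODs further away
foliage.LODDistanceScale=4
```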

But yeah the perf hit for the RT path is pretty heavy right now.
 
Kinda inevitable during the transition period right? HWUB made an interesting observation. Medium RT wasn’t much more expensive than Cinematic Lumen and produced arguably better results. Hopefully we’ll get more opportunities to make those types of direct comparisons as we march along toward that unified RT nirvana.
I'm not sure there is a 'transition period'; this might be as good as it gets, with RT ending up adding quality options but also far more complications and work for devs.
 
So… I did an in-depth dive into one frame of Black Myth using RenderDoc, and the findings are pretty unexpected. Everything I talk about is based on Cinematic quality on an RTX 3080 with 2K output, rendered at 50% resolution scale and upscaled by FSR2. No raytracing.
Here to confirm a few points already discussed in the video:
1. Yes, vegetation and terrain don't use Nanite, and they are super high poly. I spotted a small tree with 130k vertices. They are also rasterized multiple times in the frame (there's a full depth prepass for non-Nanite objects, plus extra draw calls for the cascaded shadow maps).
2. Yes, they are using CSMs. More specifically, the standard UE4 five-cascade setup. Each cascade has a resolution of 2048, and both Nanite and non-Nanite meshes get rendered into each cascade.
3. So obviously there's also the legacy GPU occlusion culling readback system for non-Nanite objects.
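For context on how a five-cascade setup like that typically divides the view distance: UE's cascades use the standard "practical split scheme", a blend of logarithmic and uniform splits. A small sketch (the near/far values are arbitrary; Wukong's actual split distances aren't known from the capture):

```python
def cascade_splits(near, far, count=5, blend=0.75):
    """Practical split scheme (PSSM): blend of logarithmic and uniform splits.
    blend=1.0 is fully logarithmic, 0.0 fully uniform."""
    splits = []
    for i in range(1, count + 1):
        f = i / count
        log_split = near * (far / near) ** f        # logarithmic distribution
        uni_split = near + (far - near) * f         # uniform distribution
        splits.append(blend * log_split + (1.0 - blend) * uni_split)
    return splits

# Far-plane distances for each of the 5 cascades over a 1..5000 unit range
print(cascade_splits(1.0, 5000.0))
```

The logarithmic term concentrates resolution near the camera, which is why the closest cascades cover only a few metres while the last one stretches to the far plane.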

What really upsets me is that they actually don't use Lumen when RT is disabled… I'll come back and explain how their GI works when I have time. But it's kinda naive and brute force, not really next gen.
I’m not 100% sure about this, but I’m 99% sure
 
Pretty obvious SDF-like reflections here. Not from Lumen?

Even the quality knob for GI behaves similarly in visual results to what the 5.0 quality knobs did. Very curious to know what is up!
 
Surely if you're going to use frame gen to hit 60fps, wouldn't it be better to enable it on the quality mode where the input resolution is higher, thus the final image would be better 🤷‍♂️
 
Looks like the devs focused primarily on the PC version for this. The relative platform sales would be interesting to see.
It's sold gangbusters on Steam with >2 million concurrent players; 88% of the players are estimated to be Chinese. Ergo, I expect console sales to be pretty muted by comparison. It was a game made for the Chinese market with a Western release, I think, and the console versions are an added extra from a Chinese dev wanting to branch out and reach Western audiences. Although curiously, if sales are that nationally dominated, there's a strong economic argument to not even bother.

It does highlight the impact of developer experience. I guess Wukong being a landmark UE5 title isn't really it; we need an established dev with AAA experience to push it. I mean, it still looks good, which shows something, but it also sounds not well optimised, and as such UE5 is capable of more.
 
Consoles didn't even officially exist in China until they were unbanned a few years ago, so it's not a surprise that the developer would concentrate their efforts on the PC.

The Frankenstein PC is destroying the PS5 in this game 😅

I was playing the game on PS5 and thankfully variable framerates don't bother me too much, and I find the input lag pretty low (tested it by eye on the jump, looked almost instantaneous to my eye, but maybe it's not the best method to test that).

I'm sure we'll see much better results soon from more experienced developers both on UE and other advanced engines.
 
Regarding 45fps caps, the first Condemned on Xbox 360 had a 45fps framerate cap.
Also, both that game and Return to Arkham actually do work out fine (conditionally) on Xbox One X and up, contrary to Oliver's recollection.
The Xbox One S/X machines both used Freesync 2 (eventually renamed to Freesync Premium), which had a 40-60fps VRR window, so 45fps works if you have the appropriate display.
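The arithmetic behind that is trivial, but as a quick sketch of the reasoning:

```python
def cap_in_vrr_window(cap_fps, window=(40, 60)):
    """True if a framerate cap falls inside a display's VRR refresh window.
    Below the window, a display needs LFC (frame doubling) or a vsync fallback."""
    lo, hi = window
    return lo <= cap_fps <= hi

# A 45fps cap sits inside the quoted 40-60Hz FreeSync window...
print(cap_in_vrr_window(45))   # True
# ...while a 30fps cap falls below it
print(cap_in_vrr_window(30))   # False
```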
 

In 2018, when the Return to Arkham One X patch came out, FreeSync TVs were very rare. I think the first ones had come out that very same year. I get new TVs all the time and I didn’t have a set with VRR support until 2021.

Generally, I was describing how the game would have run on the overwhelming majority of setups at the time of release.

Edit: looking it up now, it looks like it was just higher-end Samsung models from 2018. And of course those televisions would need to be set up correctly for VRR gaming as well.
 
I only use monitors. I have a 4K Acer monitor from 2020 that has FreeSync 2/Premium. I didn't know the patch came out so long before that.
Although I've tried that game specifically with just the 120Hz (no VRR) mode on One X, and that also helps with the visual appearance. In fact, the entire reason I bought a 120Hz monitor was Richard's DF video about it here.
 
FreeSync over HDMI is a proprietary solution to VRR and not part of the HDMI standard. This is why it's very rare on TVs and even uncommon on monitors. VRR on TVs didn't become widely supported (well, somewhat) until HDMI 2.1 launched and HDMI 2.1 TVs started releasing, as HDMI 2.1 included VRR as part of its standard. HDMI did not have any inherent support for VRR prior to 2.1.
 
Actually, you are right. There should be Lumen.
Long story short, I felt odd about Lumen not showing up in any of the frames I captured via RenderDoc, so I contacted my friend at GS today. I got confirmation that Lumen is indeed in use if GI is set to Cinematic without raytracing. It took us some time to figure out that there's actually a bug: when you switch the graphics toggle from Low to Cinematic, there's a chance the actual settings don't get refreshed properly, so the whole render pipeline stays as it was beforehand. Looks like I hit that 1% bad luck :(. Sorry for any improper hype I raised.

Anyways, I re-did the captures in a few more locations, and here are the corrected findings:
A quick update on the shadow maps: this is a bit more adaptive than what I thought. The configuration gets tweaked on a per-area basis. Later in the game I found 3-cascade setups (and this is on Cinematic), in contrast to the 5-cascade setup earlier.
[Screenshot: Snipaste_2024-08-23_00-58-14.png]

Fluids seem to be simulated via Niagara.

Back to the GI. Previously I said it was "brute force" from what I'd seen in the frame capture. Now I'm told that this was the fallback solution to Lumen when GI is set to Low. Essentially, the GI contribution comes from a global skybox and a local cubemap. In this case, the global skybox for the first chapter is an HDR cubemap picked from Unreal's HDR asset database (recognized by one of my friends), so it's not really reflecting the actual environment. The local cubemap, though, is a real capture of the surroundings.
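As a rough illustration of that skybox-plus-local-cubemap ambient model (a toy sketch, not the game's actual shader; the sampler interface and linear blend are my assumptions):

```python
def ambient_irradiance(sample_skybox, sample_local, influence):
    """Combine a global skybox cubemap with a local reflection capture.
    influence in [0, 1]: 1 = fully inside the local capture's radius.
    Both samplers take a surface normal and return an (r, g, b) tuple."""
    def sample(normal):
        sky = sample_skybox(normal)
        loc = sample_local(normal)
        # Linear blend toward the local capture as influence rises
        return tuple(influence * l + (1.0 - influence) * s
                     for s, l in zip(sky, loc))
    return sample

# Toy samplers: a bluish sky and a warmer local capture of the surroundings
sky = lambda n: (0.2, 0.3, 0.8)
local = lambda n: (0.6, 0.5, 0.3)
gi = ambient_irradiance(sky, local, influence=0.5)
print(gi((0.0, 1.0, 0.0)))  # (0.4, 0.4, 0.55) — halfway blend
```

A fixed blend like this ignores visibility entirely, which is exactly why it leaks light under occluders the way the stone screenshot below shows.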

[Screenshots: QQ图片20240823004642.png, Snipaste_2024-08-23_00-44-24.png]

That's why I spotted blue hues on the bottom of a stone sitting on the grasslands. It's your typical light leaking artifact.
[Screenshot: Snipaste_2024-08-23_00-49-34.png]

They did try to reduce the leaking problem at Low GI quality. First up is the classic SSAO, running at full res but with a not-so-high sample count (remapped here to increase contrast). Nothing special.
[Screenshot: Snipaste_2024-08-23_00-48-09.png]
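The contrast remap used for the visualisation is presumably just a power curve; a minimal sketch (the exponent is an assumption, not pulled from the capture):

```python
def remap_ao(ao, gamma=3.0):
    """Increase contrast of an AO term by raising it to a power.
    ao in [0, 1]; gamma > 1 darkens mid-tones so occlusion is easier to see."""
    return ao ** gamma

print(remap_ao(0.5))   # 0.125 — mid occlusion reads much darker
print(remap_ao(1.0))   # 1.0   — fully unoccluded stays unchanged
```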
Then there's the DFAO, which was what I saw last night. It's initially traced at 1/64 res and then temporally and spatially filtered and upscaled to 1/4 of the internal res.
[Screenshots: Snipaste_2024-08-23_00-52-36.png, Snipaste_2024-08-23_00-51-54.png]

(All the res scales I mention are the final scale multiplying both axes together, i.e. area scales.)
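Using that convention, the pixel counts work out as follows (assuming "2K" here means 2560x1440; the axis scale is the square root of the area scale):

```python
import math

def scaled_res(width, height, area_scale):
    """Resolution after an area scale: each axis scales by sqrt(area_scale)."""
    s = math.sqrt(area_scale)
    return round(width * s), round(height * s)

internal = scaled_res(2560, 1440, 0.5 * 0.5)   # 50%-per-axis FSR2 input
print(internal)                                 # (1280, 720)
print(scaled_res(*internal, 1 / 64))            # (160, 90)  — DFAO trace res
print(scaled_res(*internal, 1 / 4))             # (640, 360) — after upscale
```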
Then, in the actual Cinematic mode, Lumen is turned on, and you can clearly see every step of Lumen in the capture, from placing probes to generating surface caches, etc. In short, the quality is reasonably high. Probes are placed uniformly at 1/16 res, with adaptive placement on depth discontinuities. Surface caches have a 4096x4096 atlas.

[Screenshot: 1724346104644.png]

Later, the GI is gathered at full res, outputting diffuse GI and translucency GI.
However, that random HDR cubemap is still present ;), even in Lumen, as the fallback for the skybox.

[Screenshots: 1724346131117.png, 1724346151496.png]

I think that pretty much sums up my findings. Most of the options seem to come straight from the early versions of UE5, raytracing path aside. I was mostly curious about how UE5 performs at the various graphics levels. Still very demanding.
 
Haven't tried the game yet, but I'm legitimately confused by your statement here... the videos on that page look pretty bad quality-wise, certainly not comparable to the PS5 ones. Furthermore, the "performance" quoted there is with frame generation, which isn't comparable with base/real performance on PS5 or with it off.

I mean at some level it's cool that you can play the game at all on these machines, but it's clearly not in the same league as the early PS5 videos I've seen, right? What am I missing?
You are probably missing the wishful thinking that "consoles are bad, even the Steam Deck has a better CPU, consoles have hit their limits and will be 30fps from now on".
 