Digital Foundry Article Technical Discussion [2024]

That's incorrect, Nanite is used everywhere and on literally everything in Black Myth (trees, rocks, mountains, caves, houses, statues, props, even the terrain). Vegetation and foliage are not using Nanite, but that's not the fault of the game, 99% of UE5 titles still don't use Nanite for vegetation.
They do not use Nanite for the terrain at all as you inaccurately imply. That's effectively much of the game's level geometry (vegetation/terrain/skinned meshes) that doesn't have to come under fire from the complexity of a micropolygon sub-mesh cluster renderer when doing an HWRT implementation ...
 
I suspect that the launch of next-gen will look a lot like the launch of this gen. The Nanite+Lumen (or equivalents in other engines) games that struggle on this generation will be running at 1440p 60fps on next-gen, and there will be a lengthy cross-gen period for consumers to get used to that performance level. Then once the cross-gen period is over devs will start implementing ReSTIR GI or something similar and performance is back to 720p 60fps and 1080p 30fps. At least the consoles will have better upscaling solutions.
 
They do not use Nanite for the terrain at all as you inaccurately imply. That's effectively much of the game's level geometry (vegetation/terrain/skinned meshes)
As per Digital Foundry, terrain is Nanite (timestamped). You claimed most of the scenes are not Nanite, which is just incorrect. You also failed to provide evidence that the game is avoiding foliage Nanite, as 99% of UE5 games are not using it for vegetation or even terrain.

That's effectively much of the game's level geometry
Which makes your statement very weird indeed; by that logic, 99% of UE5 titles are not using Nanite in most scenes either.

 
1. The Legendary G80 was released a full year later than The Legendary Xbox 360.

True, but the legendary X1900XTX was launched only 2 months after the Xbox 360 and significantly exceeded it in almost every way. It's true though that at launch and for the first couple of months the 360 could go toe to toe with the highest end PC hardware on the market in several respects.

And if we combine Cell + RSX, performance was close.

Nah it really wasn't. Cell + RSX was generally enough to keep up with Xenos. G80 was comfortably double that. And if you were to assign all that SPU power to propping up GPU performance in the PS3, what kind of anaemic CPU would you be left with that would have to compete with the likes of Athlon X2s and Core 2s on the PC side?
 
I think some people are forgetting just how fast performance moved on PC back in the latter half of the 2000's.

It was a crazy time for PC hardware: DX10, RV770, G80, G92 (❤️), Core 2, first-gen i7, Phenom II X6, SSDs...

Like 2005 to 2010 was bat shit crazy for PC hardware development, and it's my personal favourite period of PC hardware.
 
The X360/PS3 generation was no better than the current generation 3-4 years after its release. First of all, the resolution was much lower on consoles at the time compared to the PC, don't forget this! About 650p vs 1080p. And the aliasing was generally heavy. Memories become beautiful...
3-4 years in, that console gen was doing a lot better. There were tons of games released with amazing levels of graphics, especially on PS3: Uncharted, Uncharted 2, Ratchet and Clank A Crack in Time (60fps), Resistance 2, Killzone 2, Gears of War 2, Forza Motorsport 3 (60 fps), to name a few. And all ran at 720p. Now we're almost 4 years into this gen and it looks like developers have only just started to scratch its true potential. Almost all games are enhanced PS4 ports, yet most of them still can't run at 4K and 60 fps. I understand what you are trying to say, but for me those times were a lot more fun and interesting.
The XSX/PS5 got powerful hardware, and this showed in their first years: many true 4K resolutions, many more games running at 60 FPS, mostly PC high settings, good AA.
In PS4 ports, yes. But even Sony exclusives weren't able to run at 60 fps above 1440p.

That's impossible, the 8800GTX was at least 2x more powerful than any hardware out there, consoles or otherwise.
How many GFLOPS was G80, and how many was Cell+RSX? ~400 and ~400. Of course G80 was more advanced and easier to program.
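For a rough sense of the arithmetic, here's a back-of-the-envelope sketch using commonly cited ballpark figures (counting methods differ between vendors, and the RSX number in particular varies a lot depending on what's counted, so treat none of this as exact):

```cpp
// Rough single-precision throughput arithmetic, commonly cited ballpark figures only.
#include <cstdio>

int main() {
    // G80 (8800 GTX): 128 stream processors at 1.35 GHz, 2 FLOPs per MAD.
    double g80_mad = 128 * 1.35 * 2;   // ~346 GFLOPS (MAD only)
    double g80_mul = 128 * 1.35 * 3;   // ~518 GFLOPS if the co-issued MUL is counted

    // Cell: ~25.6 GFLOPS per SPE at 3.2 GHz; roughly 6 SPEs available to PS3 games,
    // plus the PPE. The full chip is often quoted higher.
    double cell = 6 * 25.6 + 25.6;     // ~180 GFLOPS

    // RSX programmable shader throughput is commonly quoted anywhere from ~200 to
    // ~400 GFLOPS depending on what fixed-function work gets counted.
    double rsx_low = 200.0, rsx_high = 400.0;

    std::printf("G80: %.0f-%.0f GFLOPS, Cell+RSX: %.0f-%.0f GFLOPS\n",
                g80_mad, g80_mul, cell + rsx_low, cell + rsx_high);
}
```

Under those assumptions the two do land in a broadly similar ballpark, which is roughly the "~400 and ~400" being claimed, even though the ranges overlap rather than match.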
I'm speaking about raw performance, and the point is that the PS5 and XSX are further from the top end. This is also why these consoles have dated worse than the PS3 and X360 did for their time.

You sure?
Yes, and even in those tests the fps is not 20. OK, it's not a stable 30, but it's not 20 either. And of course there were 20 fps games on PS3 and X360. But there were also very well optimised games.
 
True, but the legendary X1900XTX was launched only 2 months after the Xbox 360 and significantly exceeded it in almost every way. It's true though that at launch and for the first couple of months the 360 could go toe to toe with the highest end PC hardware on the market in several respects.
Yes, but in the long term, no. The X1900XTX wasn't able to run DX11 games.
Nah it really wasn't. Cell + RSX was generally enough to keep up with Xenos. G80 was comfortably double that. And if you were to assign all that SPU power to propping up GPU performance in the PS3, what kind of anaemic CPU would you be left with that would have to compete with the likes of Athlon X2s and Core 2s on the PC side?
Can't agree with that. The X360 was my main console that gen (170 completed games vs 50 on PS3), but I must admit that Sony exclusives were ahead of MS exclusives every year. It was like: this year Sony is ahead, next year MS reaches the same level, but Sony surpasses them again, and it was the same situation till the end of the generation, with The Last of Us as the total winner. Sony exclusives were the most brilliant technically, and not only in that, but that is a different story. MS exclusives were very good too and often far ahead of multiplatform games. At the end of the generation, when there were cross-generation games, some of those games even surpassed MS exclusives (Metro Last Light, Crysis 3, Alien Isolation, Rise of the Tomb Raider), but again, in my opinion, they were behind the Sony exclusives from the last wave of games.

I think some people are forgetting just how fast performance moved on PC back in the latter half of the 2000's.

It was a crazy time for PC hardware: DX10, RV770, G80, G92 (❤️), Core 2, first-gen i7, Phenom II X6, SSDs...

Like 2005 to 2010 was bat shit crazy for PC hardware development, and it's my personal favourite period of PC hardware.
Completely true. This is also why I like the PS3 and X360 very much. In my opinion those consoles held up very well against the PC, and for a long time. I played on those consoles, mostly on the X360, till the end of 2015 and was completely pleased with them. I expected the Xbox One to be a 5 TF machine; PCs had GPUs with that much power for like a year before the Xbox One release. And I was very disappointed when MS revealed the specs, and by Sony's as well. I was so disappointed that I continued to play on old-gen consoles for 2 more years.
 
Completely true. This is also why I like the PS3 and X360 very much. In my opinion those consoles held up very well against the PC, and for a long time.

They really didn't...... They were abysmal after a few years.

I was fully into PC gaming during that time, and the image quality they offered compared to PC was shockingly bad.

And especially in 2007, when a little unknown game called Crysis released, it made them look so dated it wasn't even funny.
 
This is all very subjective. For me, a big difference is when the difference is like between Gran Turismo 4 and Gran Turismo 5. I personally was more blown away when I played Far Cry Instincts on the original Xbox than by the original Far Cry on my PC. And that PC had some 5 times more performance than the original Xbox, and it ran Far Cry on low settings at low fps. :) That was a real shock. I understand that a top PC always runs games in super duper quality, but that can't impress me. Maybe because I know how much performance a top PC has. But a console is another thing. It always amazes me to play any console game that makes a leap in terms of graphics, even if that leap is only within console boundaries. And the PS3/X360 gen delivered that feeling many times; every year there were some games that were better than last year's.
 
The pace of advancement during that time was ridiculous. Every year the mid-range GPU had the performance of the previous year's flagship. The top-tier GPUs were $500/€500.
It was such a great time.
 
As per Digital Foundry, terrain is Nanite (timestamped). You claimed most of the scenes are not Nanite, which is just incorrect. You also failed to provide evidence that the game is avoiding foliage Nanite, as 99% of UE5 games are not using it for vegetation or even terrain.
A detailed RenderDoc account is more reliable data than pure anecdote with qualifiers from a reviewer ...
Which makes your statement very weird indeed; by that logic, 99% of UE5 titles are not using Nanite in most scenes either.
The game doesn't make use of VSM either, because testing the clipmaps against large primitives would involve high overhead due to reduced culling efficiency ...
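To illustrate the culling-efficiency point, here's a minimal conceptual sketch (not UE's actual VSM code; the page grid size and bounds are made-up illustration values): a small Nanite cluster only has to be rasterized into a handful of shadow-map pages, while one huge non-Nanite primitive overlaps nearly every page, so per-page culling saves very little work for it.

```cpp
// Conceptual sketch only: a virtual shadow map clip level split into pages,
// with geometry culled per page by its bounds in shadow-map space.
#include <cstdio>
#include <algorithm>

struct Rect { float minX, minY, maxX, maxY; };   // bounds in shadow-map space

// Count how many pages of a pageCount x pageCount grid a bounding rect touches.
int PagesTouched(const Rect& b, float mapExtent, int pageCount) {
    float pageSize = mapExtent / pageCount;
    int x0 = std::max(0, int(b.minX / pageSize));
    int y0 = std::max(0, int(b.minY / pageSize));
    int x1 = std::min(pageCount - 1, int(b.maxX / pageSize));
    int y1 = std::min(pageCount - 1, int(b.maxY / pageSize));
    return (x1 - x0 + 1) * (y1 - y0 + 1);
}

int main() {
    const float extent = 1024.0f;                 // illustrative clip level extent
    const int   pages  = 32;                      // illustrative 32x32 page grid
    Rect naniteCluster { 100, 100, 130, 130 };    // small cluster bounds
    Rect largePrimitive{   0,   0, 1000, 1000 };  // one big non-Nanite mesh
    std::printf("cluster touches %d pages, large primitive touches %d of %d\n",
                PagesTouched(naniteCluster, extent, pages),
                PagesTouched(largePrimitive, extent, pages), pages * pages);
}
```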
 
A detailed RenderDoc account is more reliable data than pure anecdote with qualifiers from a reviewer ...
I will wait for the final judgment on that; that very same RenderDoc gave a false result for Lumen.

The game doesn't make use of VSM either, because testing the clipmaps against large primitives would involve high overhead due to reduced culling efficiency ...
Irrelevant to our discussion here. Your "theory" about the game not using Nanite due to Path Tracing is incorrect; the game is heavily using Nanite. It is not using foliage Nanite because the vast majority of UE5 titles are not using foliage Nanite or even terrain Nanite. That's the current state of UE5 titles so far.

As for Path Tracing and Nanite, I think we can all agree that the results shown so far are anything but bad. The performance is reasonable and is mostly in line with other path traced titles that don't rely on virtualized geometry. This is one more case of actual application "facts on the ground" winning over semi-educated guesswork.
 
Irrelevant to our discussion here. Your "theory" about the game not using Nanite due to Path Tracing is incorrect; the game is heavily using Nanite. It is not using foliage Nanite because the vast majority of UE5 titles are not using foliage Nanite or even terrain Nanite. That's the current state of UE5 titles so far.
Isn't the reason they aren't using Nanite on foliage that they are using UE 5.0, while foliage support got added in 5.3?

What's the latest version of NVRTX supported in Unreal? 5.2?
 
I will wait for the final judgment on that; that very same RenderDoc gave a false result for Lumen.
It's far easier to identify/verify Nanite usage in an API debugging tool than it is for Lumen. RenderDoc lets you easily see exactly *what* geometry is being written to the visibility buffer texture resource ...
Irrelevant to our discussion here. Your "theory" about the game not using Nanite due to Path Tracing is incorrect; the game is heavily using Nanite. It is not using foliage Nanite because the vast majority of UE5 titles are not using foliage Nanite or even terrain Nanite. That's the current state of UE5 titles so far.

As for Path Tracing and Nanite, I think we can all agree that the results shown so far are anything but bad. The performance is reasonable and is mostly in line with other path traced titles that don't rely on virtualized geometry. This is one more case of actual application "facts on the ground" winning over semi-educated guesswork.
It's very much relevant to the subject at hand because not using Nanite at its full power means that you can't extract the maximum tight culling boundary benefits afforded to VSM ...

As far as results being "anything but bad", one of the creators of UE's VSM technique here wasn't very impressed ...
 
It's very much relevant to the subject at hand because not using Nanite at its full power means that you can't extract the maximum tight culling boundary benefits afforded to VSM ...
Which is one of the drawbacks of the whole system to begin with: it lacks flexible customization. You have to use Nanite+Lumen+Virtual Shadow Maps to get a semi-reasonable performance/visuals ratio, and the tradeoff is that you end up losing a huge chunk of performance already for outdated visuals/tech (screen space reflections, mixed world space + screen space global illumination, and rasterized shadows with limited shadow casting capabilities).

That's why not everybody is going to use the whole system, and why you will have games that use only Nanite, or only Lumen and forgo the other features, or games pushing the boundaries elsewhere, depending on their needs.

As far as results being "anything but bad", one of the creators of UE's VSM technique here wasn't very impressed ...
Actual objective numbers and visuals on the ground will always defy any "subjective" opinions or incorrect theories.

Isn't the reason they aren't using Nanite on foliage that they are using UE 5.0, while foliage support got added in 5.3?
Yep, it's a late addition; there are no games released with 5.3 except Fortnite.

What's the latest version of NVRTX supported in Unreal? 5.2?
It's 5.3 with RTXDI and ray traced caustics.

 
Which is one of the drawbacks of the whole system to begin with: it lacks flexible customization. You have to use Nanite+Lumen+Virtual Shadow Maps to get semi-reasonable performance, and the tradeoff is that you end up losing a huge chunk of performance already for outdated visuals/tech (screen space reflections, mixed world space + screen space global illumination, and rasterized shadows with limited shadow casting capabilities).

That's why not everybody is going to use the whole system, and why you will have games that use only Nanite, or only Lumen and forgo the other features, or games pushing the boundaries elsewhere, depending on their needs.
Nvidia already admits to needing a bunch of screen space fixups and inaccurate hacks to prevent tons of visual artifacts appearing in their RTX branch of UE, since they can't represent the exact geometry of Nanite meshes. So your criticism of UE's default rendering systems isn't as much of a slam dunk as you think it is when those alternatives produce mixed results as well ...
Actual objective numbers and visuals on the ground will always defy any "subjective" opinions or incorrect theories.
A DF assessment isn't gospel like you always seem to believe it is; they make errors along the way too ...
 
Yep, it's a late addition; there are no games released with 5.3 except Fortnite.


It's 5.3 with RTXDI and ray traced caustics.
They locked in on 5.0 to get the game out as soon as possible. I get it, the game has a massive amount of content that would need to be retested or redone.

(There is another game made on 5.3, Hellblade 2 😬)
 
They locked in on 5.0 to get the game out as soon as possible. I get it, the game has a massive amount of content that would need to be retested or redone.

That requires discipline. Good for them.

As far as results being "anything but bad", one of the creators of UE's VSM technique here wasn't very impressed ...

Andrew raised some concerns about GI quality. Haven’t seen any complaints about the shadow quality. He mentioned RT shadows were static but that doesn’t seem accurate based on released footage.
 
Nvidia already admits to needing a bunch of screen space fixups and inaccurate hacks in their RTX branch of UE, since they can't represent the exact geometry of Nanite meshes. So your criticism of UE's default rendering systems isn't as much of a slam dunk as you think it is when those alternatives produce mixed results as well ...
Rendering is all hacks, but even with NVIDIA's "very small" concessions, they are far better visually and technologically than what we have in stock UE5 right now. Black Myth renders ray traced caustics and particle reflections; I haven't seen anything close to that in any UE game.

A DF assessment isn't gospel like you always seem to believe it is; they make errors along the way too ...
I am assessing the game's performance by the output fps, and I am assessing the visuals with my own eyes and through the eyes of other experts. What are you basing your statements on, besides incorrect claims like "the game doesn't use Nanite"?
 
Andrew raised some concerns about GI quality. Haven’t seen any complaints about the shadow quality. He mentioned RT shadows were static but that doesn’t seem accurate based on released footage.
The RT shadows could very well be combined with screen space shadows or some bias factor technique to reduce the potential artifacting from not using the real geometry in the acceleration structure ...
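As a minimal sketch of what that kind of combination could look like (assumed, illustrative code only, not the game's or NVIDIA's actual implementation): nudge the shadow-ray origin along the surface normal to hide small mismatches between the proxy geometry in the BVH and the rendered surface, then keep the darker of the ray-traced visibility and a screen-space contact-shadow term.

```cpp
// Illustrative sketch of combining RT and screen-space shadow terms with a bias.
#include <algorithm>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Nudge the ray origin along the normal; the bias value is illustrative only.
Vec3 OffsetAlongNormal(const Vec3& p, const Vec3& n, float bias) {
    return { p.x + n.x * bias, p.y + n.y * bias, p.z + n.z * bias };
}

// rtVisibility and ssVisibility stand in for the results of a shadow-ray trace
// and a screen-space depth march (0 = fully shadowed, 1 = fully lit).
float CombineShadowTerms(float rtVisibility, float ssVisibility) {
    return std::min(rtVisibility, ssVisibility);   // keep whichever term is darker
}

int main() {
    Vec3 origin = OffsetAlongNormal({0, 0, 0}, {0, 1, 0}, 0.02f);
    float shadow = CombineShadowTerms(/*rt*/ 1.0f, /*screen-space*/ 0.3f);
    std::printf("biased origin y = %.2f, combined shadow = %.2f\n", origin.y, shadow);
}
```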
Rendering is all hacks, but even with NVIDIA's concessions, they are far better visually and technologically than what we have in stock UE5 right now. Black Myth renders ray traced caustics and particle reflections; I haven't seen anything close to that in any UE game.
Can anybody take Nvidia's claim that they're actually doing "path tracing" with their RTX branch of UE5 seriously at that point, if there are major inaccuracies like their need for screen space hacks?
I am assessing the game's performance by the output fps, and I am assessing the visuals with my own eyes and through the eyes of other experts. What are you basing your statements on, besides incorrect claims like "the game doesn't use Nanite"?
I never claimed that they were not using Nanite, just that they don't subscribe to its full power ...
 