Digital Foundry Article Technical Discussion [2024]

I think we can say that adopting Nanite-like micropolygon systems was a mistake this generation. UE5 and its ilk are not suited to this hardware.

I'm sure that Indiana Jones, for example, will show much better image quality at 60FPS while still having detailed, beautiful graphics. The id Tech engine is a better fit for current-generation consoles.
Indiana Jones is also a big step down in terms of asset quality and lighting compared to UE5 Lumen + Nanite driven titles.

I'd rather have true current-gen lighting and better geometry than higher resolution. I think the current compromises on consoles are just fine.
 
Indiana Jones is also a big step down in terms of asset quality and lighting compared to UE5 Lumen + Nanite driven titles.

I'd rather have true current-gen lighting and better geometry than higher resolution. I think the current compromises on consoles are just fine.
More detailed graphics are useless if the glasses you're viewing them through aren't good enough.

These consoles should be capable of 1440p with well-tuned FSR at 60FPS. If you're viewing the picture on a large TV, that is the minimum.

That should be the baseline, and developers should use engines that meet this basic image-quality requirement. You can create fairly detailed graphics even without Nanite; there are examples of that in this generation.
 
I think we can say that adopting Nanite-like micropolygon systems was a mistake this generation. UE5 and its ilk are not suited to this hardware.

I'm sure that Indiana Jones, for example, will show much better image quality at 60FPS while still having detailed, beautiful graphics. The id Tech engine is a better fit for current-generation consoles.
The thing about that is that these games often have dynamic res and can/will scale up on future hardware. That's pretty important. Also, nobody is being forced to use Nanite or Lumen. UE5 has enough improvements outside of those features that you can confidently say UE5 is better for these consoles than UE4 is. It's the developers who need to make smarter decisions for their games.

That, and it's probably better that devs start using these new features and adjusting their workflows now instead of later.
 
I think we can say that adopting Nanite-like micropolygon systems was a mistake this generation. UE5 and its ilk are not suited to this hardware.

I'm sure that Indiana Jones, for example, will show much better image quality at 60FPS while still having detailed, beautiful graphics. The id Tech engine is a better fit for current-generation consoles.
Ironic that you would post this, since UE5's state-of-the-art GPU-driven renderer (Nanite) is closer to the vision of future rendering advances held by John Carmack (one of the original creators of id Tech), who also sought a solution to the unresolved problem of virtualizing geometry ...
 
Yes, Carmack's pioneering ideas about virtualizing geometry will be realized as well; Nanite is one proof of that. However, the technology used in id Tech 7 is probably better suited to current-generation consoles. The main evidence is image resolution, which is much closer to 4K in the id engine (and in several other more traditional game engines) than in the geometrically overdriven UE5.

I have no doubt that UE5 and Nanite are the future, but the technology does not perform well on current consoles. A render resolution below 1080p results in unacceptable image quality despite sharpening. Newer multiplatform games show this well.
 
I think we can say that adopting Nanite-like micropolygon systems was a mistake this generation. UE5 and its ilk are not suited to this hardware.

I'm sure that Indiana Jones, for example, will show much better image quality at 60FPS while still having detailed, beautiful graphics. The id Tech engine is a better fit for current-generation consoles.
UE5 and Nanite were meant to run on current-gen hardware - at 1080p 30FPS, with lighting provided by Lumen and VSMs on their lowest settings. It achieved that goal. At the time UE5 development began, 30FPS had been the standard for three console generations, consumers had put up with a measly resolution increase from 720p to 900p going from the 7th-gen to the 8th-gen consoles, 4K TVs weren't standard yet, and HWRT wasn't on the horizon. So 1080p 30FPS without HWRT was a perfectly reasonable goal.
 
The added pressure to ship a game at 60fps now (a result of the long cross-gen period) is also not helping UE5 on current-gen consoles.

Any game that isn't 60fps or doesn't have a 60fps mode gets a right battering on social media, but the people screaming about it don't understand that UE5 + 60fps + console = blurry-ass IQ.
 
UE5 and Nanite were meant to run on current-gen hardware - at 1080p 30FPS, with lighting provided by Lumen and VSMs on their lowest settings. It achieved that goal. At the time UE5 development began, 30FPS had been the standard for three console generations, consumers had put up with a measly resolution increase from 720p to 900p going from the 7th-gen to the 8th-gen consoles, 4K TVs weren't standard yet, and HWRT wasn't on the horizon. So 1080p 30FPS without HWRT was a perfectly reasonable goal.
Wasn't the target 1440p 30 with hardware Lumen and 1080p 60 with software Lumen?

And most PS4 games for most of its life were 1080p; if the PS4 had had a decent CPU, we would have had 60fps in most games. That bandwidth is still good.
 
Yes, Carmack's pioneering ideas about virtualizing geometry will be realized as well; Nanite is one proof of that. However, the technology used in id Tech 7 is probably better suited to current-generation consoles. The main evidence is image resolution, which is much closer to 4K in the id engine (and in several other more traditional game engines) than in the geometrically overdriven UE5.

I have no doubt that UE5 and Nanite are the future, but the technology does not perform well on current consoles. A render resolution below 1080p results in unacceptable image quality despite sharpening. Newer multiplatform games show this well.
I have yet to see their upcoming results/advances with Indiana Jones, but maybe id Tech has a higher performance profile because it aims for a lower graphical-fidelity ceiling? UE5's Nanite virtualized geometry, by comparison, is a fully GPU-driven renderer with a hybridized deferred texturing/shading approach that offers high-performance compatibility with complex material systems like Substrate ...

id Tech, as far as I know, is mostly a clustered forward renderer (highly common in mobile graphics) that isn't completely GPU-driven, and they ban shader/material graphs too, which prevents the PSO combinatorial explosion seen in many PC releases, but it's also a design that keeps the compiler/drivers from applying inlining optimizations, which leaves performance on the table ...
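
To make the PSO explosion point concrete, here's a toy sketch. Every axis and count in it is made up purely for illustration, not taken from UE5 or id Tech:

```python
# Toy illustration with made-up numbers: why artist-authored material graphs
# can multiply shader/PSO permutations, while a fixed hand-written shader set
# stays small. None of these axes or counts come from a real engine.

feature_axes = {
    "lighting_model": 3,        # e.g. default lit / subsurface / clear coat
    "blend_mode": 4,            # opaque / masked / translucent / additive
    "static_switches": 2 ** 6,  # six boolean switches in a material graph
    "vertex_factory": 5,        # static / skeletal / instanced / landscape / spline
}

permutations_per_material = 1
for count in feature_axes.values():
    permutations_per_material *= count

num_materials = 500  # hypothetical mid-sized project
total = permutations_per_material * num_materials
print(f"Worst-case graph-driven permutations: {total:,}")
# vs. a banned-graphs approach: a few hundred hand-written shaders in total.
```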

It's clear to anyone that UE5 has higher performance overhead than something like id Tech, because it has more powerful graphical features that allow for more flexible workflows, such as material graphs and its sub-mesh LoD clustering technique, no? Two geometry passes for the visibility/G-buffer and the many associated rendering passes, which consume memory bandwidth through repeated read/write accesses to the G-buffer, aren't going to be super fast, but that's a strategy that works for THEIR systems ...

I wonder what id Tech's rendering pipeline architecture would look like if they tried to support these very same graphical features?
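
To put a rough number on the G-buffer bandwidth point above, here's a crude back-of-envelope sketch. The bytes-per-pixel and pass counts are assumptions for illustration only, not measured figures from either engine:

```python
# Back-of-envelope sketch (assumed numbers, not measurements): memory traffic
# from repeated read/write passes over a G-buffer at a given internal resolution.

def gbuffer_traffic_gb_per_s(width, height, bytes_per_pixel, passes_per_frame, fps):
    """Estimate GB/s of G-buffer traffic, assuming each pass touches every pixel once."""
    bytes_per_frame = width * height * bytes_per_pixel * passes_per_frame
    return bytes_per_frame * fps / 1e9

# Assumptions: 1080p internal res, ~20 bytes/pixel across G-buffer targets,
# ~8 full-screen read+write passes per frame, 60 fps target.
traffic = gbuffer_traffic_gb_per_s(1920, 1080, 20, 8, 60)
print(f"~{traffic:.0f} GB/s of G-buffer traffic")
# Roughly 20 GB/s against the PS5's ~448 GB/s of total memory bandwidth,
# which the CPU, textures, geometry and everything else also have to share.
```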
 
UE5 and Nanite were meant to run on current-gen hardware - at 1080p 30FPS, with lighting provided by Lumen and VSMs on their lowest settings. It achieved that goal. At the time UE5 development began, 30FPS had been the standard for three console generations, consumers had put up with a measly resolution increase from 720p to 900p going from the 7th-gen to the 8th-gen consoles, 4K TVs weren't standard yet, and HWRT wasn't on the horizon. So 1080p 30FPS without HWRT was a perfectly reasonable goal.
Technically, 1080p/30FPS may be acceptable as a target, but only if we compare against the original Xbox One and PS4. In the meantime, though, the PS4 Pro and Xbox One X arrived, consoles that brought 1440p and native 4K image quality, sometimes even at 60FPS!

We got used to higher resolutions and, indeed, 60FPS thanks to the multiplatform cross-gen games that appeared at the beginning of this generation. So the barely-1080p image quality of recently released games isn't enough... blurry nonsense. I would prefer games with more traditional solutions but at a higher resolution. Current consoles are not suited to UE5 in terms of visual output. I also expected developers would be able to extract higher resolutions from this hardware over time, but it seems only Fortnite, from the team that created UE5, and probably The Coalition's Gears magic can manage it. Two games... not enough.

Beautiful games can be made for PS5/XSX at higher resolutions, but only with engines suited to that. However, since the current performance-hungry trends will not be reversed, hah... bring on the new Xbox next year, even if I have to pay up to 1,000 euros for it!
 
I think we should at least wait for some current-gen id Tech 8 games before we draw conclusions. That being said, Nanite isn't the performance hog, Lumen is.
 
I think we should at least wait for some id Tech 7 games before we draw conclusions. That being said, Nanite isn't the performance hog, Lumen is.
Nanite and other modern systems that pack hundreds of millions of micropolygons onto the screen are all performance-demanding; there have been games that use only Nanite, and the rendered resolution is still low. This is even more true when it's used together with ray tracing.

For modern micropolygon systems, RAM and memory bandwidth are probably among the most decisive factors.
 
Wasn't the target 1440p 30 with hardware Lumen and 1080p 60 with software Lumen?
From the Unreal documentation:
Lumen targets 30 and 60 frames per second (fps) on consoles with 8ms and 4ms frame budgets at 1080p for global illumination and reflections on opaque and translucent materials, and volumetric fog. The engine uses preconfigured Scalability settings to control Lumen's target FPS. The Epic scalability level targets 30 fps. The High scalability level targets 60 fps.

Lumen relies on Temporal Upsampling with Unreal Engine 5's Temporal Super Resolution (TSR) for 4k output. Lumen and other features use a lower internal resolution (1080p), which gives TSR the best final image quality. Otherwise, rendering these features at 4K natively would need lower quality settings to achieve 30 or 60 fps.
  • Cinematic scalability level targets Movie Render Queue.
  • Epic scalability level targets a 30 fps console budget.
  • High scalability level targets a 60 fps console budget.
  • Low and Medium scalability levels disable Lumen features.
So 1080p 30FPS on one preset, 1080p 60FPS on another, and higher resolutions can be achieved by lowering quality settings below those presets.
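
The arithmetic behind those budgets is easy to sanity-check. In the quick sketch below, only the 8 ms / 4 ms Lumen budgets and the 1080p internal / 4K output figures come from the documentation quoted above; the rest is plain division:

```python
# Quick sanity check of the quoted budgets: what share of the frame Lumen gets,
# and how much pixel count TSR has to reconstruct from the 1080p internal res.

frame_time_ms = {30: 1000 / 30, 60: 1000 / 60}   # total per-frame budget
lumen_budget_ms = {30: 8.0, 60: 4.0}             # Lumen GI + reflections budget (from the docs)

for fps, total in frame_time_ms.items():
    share = lumen_budget_ms[fps] / total * 100
    print(f"{fps} fps: {total:.1f} ms frame, Lumen budget {lumen_budget_ms[fps]} ms (~{share:.0f}%)")

# TSR upscale factor from the 1080p internal resolution to a 4K output:
internal = 1920 * 1080
output = 3840 * 2160
print(f"TSR reconstructs {output / internal:.0f}x the internal pixel count")
```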
 
Oh, I remembered wrong. Anyway, let's wait for CDPR, The Coalition, or some Sony first party to really judge UE5. Before Paragon, UE4 was terrible on console. Any time now...
UE4 became OK on PS4, but only at 30fps (on PS4 Pro too). It's excellent on PS5 at 60fps and increased resolution. I fear that UE5 will never be great on PS5.
 
UE4 became OK on PS4, but only at 30fps (on PS4 Pro too). It's excellent on PS5 at 60fps and increased resolution. I fear that UE5 will never be great on PS5.
To be fair, most games on PS4 are CPU-limited; almost no complex, ambitious game runs at 60 on it. Let's have some faith. I'm sure we won't have to wait too long for someone to use a recent version of UE targeting 1080p to 4K with TSR. The best practices for Nanite and Lumen are being worked out right now. It's just that games take so long to come out that there aren't enough examples to even tell whether there are improvements.
 
This gen, kind of like the PS3/360 gen, isn't adequately equipped for developers' ambitions.
Completely disagree. The PS3/X360 gen was a lot better.
1) At launch, those consoles were at top-end PC level, especially the X360, unlike the PS5 and XSX.
2) The graphics level before those consoles launched was so much lower that the new consoles were more than ready for next gen.
 
Completely disagree. The PS3/X360 gen was a lot better.
1) At launch, those consoles were at top-end PC level, especially the X360, unlike the PS5 and XSX.
2) The graphics level before those consoles launched was so much lower that the new consoles were more than ready for next gen.
And yet, games were usually running at 30fps with frequent drops to 20 or even lower. Games that weren't using deferred rendering ran pretty well and were also sharper, at a higher resolution. That's the same situation we have right now with ray tracing, with the added problem that FSR isn't great.
 