Upscaling Technology Has Become A Crutch

A 6800 XT or 3080 Ti for 1440p/60fps on medium settings with upscaling? Excuse me, but what?
Sigh, and it doesn't even support hardware ray tracing.

At any rate, you shouldn't be shocked; this game behaves in line with Fortnite, a game with relatively simple graphics that still requires high-end hardware to run at native resolution.

It seems the combination of Lumen and Nanite takes a heavy toll indeed, as Andrew stated.
 
This happens every time there's a big change in rendering that has performance implications. There were people who thought Quake looked bad at the time because sprites looked cleaner. There were people who thought Virtua Fighter and Tekken looked worse than 2D fighters for the same reason. There were huge arguments about Half-Life 2 graphics vs Doom 3 graphics, because Half-Life 2 ran better and had sharper textures. Same with Painkiller, which at the time ran very fast and had ultra-sharp textures. This has always just been a thing. Some new tech comes along with big performance costs, and some people are just going to prefer a low-polygon tree with a nice bark texture slapped on top.
Exactly. Based on what has been said so far, UE5 also brings another important benefit: it streamlines game development and shortens development times, thanks to the modern tools built into the engine. And that plays a significant role in the development of today's and future games.
 
You have to move with the times. Lifelike, detailed modeling, mesh shaders and billions of micro-polygons require such resources.

However, don't forget that this game runs at 60fps on current consoles with Nanite lighting. If the TSR resolution and antialiasing are good enough, this game will be a good example of how UE5 can look on today's consoles. A few days and it will be clear.
I’m more than ready to move on with the times. I look at Cyberpunk Path Traced and I’m like, I get it. Immortals of Aveum on the other hand? Absolutely not. I also do not expect this game to hold 60fps on console, and I expect PS3-era resolutions before upscaling. Also, imo, UE5 TSR is worse than FSR 2.2 or DLSS, and I already don’t like those…
This happens every time there's a big change in rendering that has performance implications. There were people who thought Quake looked bad at the time because sprites looked cleaner. There were people who thought Virtua Fighter and Tekken looked worse than 2D fighters for the same reason. There were huge arguments of Half-Life 2 graphics vs Doom 3 graphics, because Half-Life 2 ran better and had sharper textures. Same with Painkiller at that time that ran very fast and had ultra sharp textures. This has always just been a thing. Some new tech comes along with big performance costs and some people are just going to prefer a low polygon tree with a nice bark texture slapped on top.
This is fairly reductive. I mean, a change in rendering always brings performance implications but, when a 4090 is running at 40fps in Remnant 2 at 4K with no ray tracing, one has to really question the usefulness of that solution. At 1440p, it’s around 72ish fps with no upscaling. It just doesn’t make sense, and a very strong argument could be made that Nanite is not going to be the de facto standard going forward due to its prohibitive cost. When will 4090 performance make it to the masses at the sub-$300 price bracket so they can play Remnant 2 at reasonable frame rates? 10 years? When most end users struggle to utilize the feature, is it really even useful?
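
To put those numbers in perspective, here's a back-of-the-envelope sketch (plain C++, using only the fps figures quoted above; the assumption that GPU cost scales purely with pixel count is mine):

```cpp
#include <cstdio>

int main() {
    // Figures quoted above: a 4090 in Remnant 2, native rendering, no ray tracing.
    const double fps4k    = 40.0;  // observed at 3840x2160
    const double fps1440p = 72.0;  // observed at 2560x1440 (approx.)

    const double pixels4k    = 3840.0 * 2160.0;
    const double pixels1440p = 2560.0 * 1440.0;

    // If GPU cost scaled purely with pixel count, dropping to 1440p
    // (2.25x fewer pixels) should yield roughly 2.25x the frame rate.
    const double predicted1440p = fps4k * (pixels4k / pixels1440p);

    std::printf("pixel ratio 4K:1440p = %.2f\n", pixels4k / pixels1440p);
    std::printf("predicted 1440p fps  = %.0f\n", predicted1440p);  // ~90
    std::printf("observed  1440p fps  = %.0f\n", fps1440p);        // ~72
    // The gap suggests sizeable per-frame work (geometry processing, culling,
    // shadow/GI updates) that does not shrink with resolution.
    return 0;
}
```

The ~90 fps prediction versus the observed ~72 fps points to per-frame costs that don't scale down with resolution, which is part of why these numbers feel so disproportionate.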
 
Exactly. Based on what has been said so far, UE5 also brings another important benefit: it streamlines game development and shortens development times, thanks to the modern tools built into the engine. And that plays a significant role in the development of today's and future games.
I’d argue that tools and artist work are not the main driver of long development times. Instead, I’d argue that the main driver of high costs and long timelines is self-imposed scope driven by ambition. Games have ballooned in scope and content at the expense of quality, and most consumers are not asking for the increased scope (cough **Assassin's Creed** cough). If you look at game completion rates, you’ll see that it’s a minority looking for this sort of scope. That, of course, is a discussion for another thread.
 
I’d argue that tools and artist work are not the main driver of long development times. Instead, I’d argue that the main driver of high costs and long timelines is self-imposed scope driven by ambition. Games have ballooned in scope and content at the expense of quality, and most consumers are not asking for the increased scope (cough **Assassin's Creed** cough). If you look at game completion rates, you’ll see that it’s a minority looking for this sort of scope. That, of course, is a discussion for another thread.
I think we should wait until more games are released with this engine, and then we can evaluate how it performs on consoles. We'll know more in a few days when Immortals of Aveum comes out.

Furthermore, I have no doubt that Senua's Saga and Fable will look fantastic on Xbox Series X, bringing UE5's technology to the masses.
 
I think we should wait until more games are released with this engine, and then we can evaluate how it performs on consoles. We'll know more in a few days when Immortals of Aveum comes out.

Furthermore, I have no doubt that Senua's Saga and Fable will look fantastic on Xbox Series X, bringing UE5's technology to the masses.
Isn't Playground modifying Forzatech for Fable?
 
Sigh, and it doesn't even support hardware ray tracing.

At any rate, you shouldn't be shocked; this game behaves in line with Fortnite, a game with relatively simple graphics that still requires high-end hardware to run at native resolution.

It seems the combination of Lumen and Nanite takes a heavy toll indeed, as Andrew stated.

Fortnite is when you build a cutting-edge geometry system with a software rasterizer, a dynamic global illumination system and a unified shadowing system that can handle hundreds of players wearing custom skins in an open world destroying and building structures, and someone calls it relatively simple. Relative to what? There's absolutely nothing simple about the tech or even the visuals in Fortnite. It just doesn't chase a photo-real art style.

This is one of the worst cases it has to support:

Refresher on the visuals.

Nice breakdown of the tech as it's specifically applied to Fortnite. On console they leave performance on the table to support future community events that can demand much more than regular gameplay.

Not to mention users can create their own content.
 
I’m more than ready to move on with the times. I look at Cyberpunk Path Traced and I’m like, I get it. Immortals of Aveum on the other hand? Absolutely not. I also do not expect this game to hold 60fps on console, and I expect PS3-era resolutions before upscaling. Also, imo, UE5 TSR is worse than FSR 2.2 or DLSS, and I already don’t like those…

This is fairly reductive. I mean, a change in rendering always brings performance implications but, when a 4090 is running at 40fps in Remnant 2 at 4K with no ray tracing, one has to really question the usefulness of that solution. At 1440p, it’s around 72ish fps with no upscaling. It just doesn’t make sense, and a very strong argument could be made that Nanite is not going to be the de facto standard going forward due to its prohibitive cost. When will 4090 performance make it to the masses at the sub-$300 price bracket so they can play Remnant 2 at reasonable frame rates? 10 years? When most end users struggle to utilize the feature, is it really even useful?

Remnant 2 did end up patching performance, and it seemed to range from 30-50% increases on the GPU side, at least in GPU-limited areas that I tested. I find with DLSS I'm frequently CPU-limited now. Not sure what the reason is. The discussion around Remnant 2 is unfortunate because in my limited time it's been very good. There may be some truth to Nanite not being suitable for teams that really can't take advantage of the complexity the geometry system allows. Remnant 2 is by no means a bad-looking game, and I think Nanite actually shines in parts that I've seen; it's just that the overall presentation is not AAA because it's not a AAA game.

Overall it's on track to be AAA in terms of gameplay, as was Remnant from the Ashes. It's challenging and has a large variety of boss fights and puzzles. There seem to be secrets with branching options so you can do different things in different playthroughs. The itemization is very good, giving a lot of build flexibility. It will probably end up being one of my best games of the year, if it stays on track.
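
On the CPU-limited point, a tiny illustrative model shows why upscaling tends to expose the CPU; the millisecond numbers below are invented for the example, not measured from Remnant 2:

```cpp
#include <algorithm>
#include <cstdio>

// Illustrative only: per-frame time is roughly bounded by the slower of the
// CPU and GPU sides of the pipeline.
double frameTimeMs(double cpuMs, double gpuMs) {
    return std::max(cpuMs, gpuMs);
}

int main() {
    const double cpuMs       = 12.0;  // hypothetical game/render-thread cost
    const double gpuNative   = 20.0;  // hypothetical GPU cost at native res
    const double gpuUpscaled = 9.0;   // hypothetical GPU cost with DLSS/FSR

    std::printf("native:   %.1f ms (%.0f fps) - GPU-bound\n",
                frameTimeMs(cpuMs, gpuNative), 1000.0 / frameTimeMs(cpuMs, gpuNative));
    std::printf("upscaled: %.1f ms (%.0f fps) - now CPU-bound\n",
                frameTimeMs(cpuMs, gpuUpscaled), 1000.0 / frameTimeMs(cpuMs, gpuUpscaled));
    // Once the upscaler pulls GPU time below CPU time, extra GPU headroom
    // stops improving frame rate, which matches the CPU-limited observation.
    return 0;
}
```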
 
There's absolutely nothing simple about the tech or even the visuals in Fortnite. It just doesn't chase a photo-real art style.
I agree, it has state-of-the-art systems; however, you should remember that the game was released many years ago (in 2017). Most of the other systems you mentioned have been there since day one; it only received Lumen, Nanite and VSMs recently.

The game has simple animations, basic physics and destruction, average-looking particles, and the geometry of dynamic objects such as characters, weapons and vehicles is also average. It has no advanced physics or hair simulations. This is why I called it relatively simple compared to something like The Last of Us Part I or Uncharted, etc. Fortnite is simple by design, as it's a massive multiplayer shooter that needed to be relatively lightweight to accommodate all the crazy action.

Yet despite that, you lose huge chunks of performance whenever the game transitions to the new features: first you lose fps when you enable DX12, then you lose more when Nanite is enabled, then Lumen, then VSMs. In the end your frame rate drops to roughly a third when moving from max DX11 to max DX12. A 4070 Ti went from 217 fps at 1080p to 78 fps. Imagine that. A 3090-class GPU doing 78 fps at 1080p.
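
Expressed in frame-time terms, a quick sketch using just the two fps figures above (no other assumptions) looks like this:

```cpp
#include <cstdio>

int main() {
    // Figures quoted above for a 4070 Ti in Fortnite at 1080p.
    const double fpsDx11Max = 217.0;  // max settings, DX11 path
    const double fpsDx12Max = 78.0;   // max settings with Nanite, Lumen, VSMs

    const double msDx11 = 1000.0 / fpsDx11Max;  // ~4.6 ms per frame
    const double msDx12 = 1000.0 / fpsDx12Max;  // ~12.8 ms per frame

    std::printf("DX11 max: %.1f ms/frame\n", msDx11);
    std::printf("DX12 max: %.1f ms/frame\n", msDx12);
    std::printf("added GPU work: ~%.1f ms/frame (%.1fx total)\n",
                msDx12 - msDx11, msDx12 / msDx11);
    // Roughly 8 ms of extra per-frame work for Nanite + Lumen + VSMs at 1080p;
    // the same absolute cost is a much smaller relative hit at lower base fps.
    return 0;
}
```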


Which is why I am saying that this Immortals game is not behaving out of the ordinary when compared to Fortnite. That's how things behave now with UE5; people shouldn't really be shocked.
 
A 3090-class GPU doing 78 fps at 1080p.
I've said this before, but I really feel like the long PS4/XB1 generation broke a lot of people's expectations. A game using the most cutting-edge tech runs at 80fps on a high-end GPU? That's been the norm as long as there have been GPUs.

There was a recent blip where there were very few games using cutting-edge tech, so GPUs got to crank huge frame rates on 2014-era games.
 
I agree, it has state-of-the-art systems; however, you should remember that the game was released many years ago (in 2017). Most of the other systems you mentioned have been there since day one; it only received Lumen, Nanite and VSMs recently.

The game has simple animations, basic physics and destruction, average-looking particles, and the geometry of dynamic objects such as characters, weapons and vehicles is also average. It has no advanced physics or hair simulations. This is why I called it relatively simple compared to something like The Last of Us Part I or Uncharted, etc. Fortnite is simple by design, as it's a massive multiplayer shooter that needed to be relatively lightweight to accommodate all the crazy action.

Yet despite that, you lose huge chunks of performance whenever the game transitions to the new features: first you lose fps when you enable DX12, then you lose more when Nanite is enabled, then Lumen, then VSMs. In the end your frame rate drops to roughly a third when moving from max DX11 to max DX12. A 4070 Ti went from 217 fps at 1080p to 78 fps. Imagine that. A 3090-class GPU doing 78 fps at 1080p.


Which is why I am saying that this Immortals game is not behaving out of the ordinary when compared to Fortnite. That's how things behave now with UE5; people shouldn't really be shocked.

We're just going to have to agree to disagree on Fortnite. The idea that it's "relatively simple" or "simple by design", considering the massive amount of tech and engineering hours that have been dumped into it to actually make it work, doesn't make any sense. Fortnite has continually been a testing ground for new UE features.
 
I've said this before, but I really feel like the long PS4/XB1 generation broke a lot of people's expectations. A game using the most cutting-edge tech runs at 80fps on a high-end GPU? That's been the norm as long as there have been GPUs.

There was a recent blip where there were very few games using cutting-edge tech, so GPUs got to crank huge frame rates on 2014-era games.
Maybe, but performance has been cut to a third of its original value while failing to deliver visuals that are 3x better. So far, UE5’s pitch hasn’t been about the end user; it’s been about saving developers/publishers money. The end user is getting shafted in the process. They must upgrade their hardware to get marginally better visuals?

Devs got really good at faking it using traditional raster methods: baked GI, SSR, planar reflections, real-time cube maps, POM, etc. Now when the end user sees the raster end product compared to the ray-traced “proper” end product, the first thing they notice is the significant loss in performance.

In many instances, end users claim that they prefer the end product of the raster version. I think Epic and devs need to ensure that their ambition doesn’t stray from the realm of practicality for the end user. As it stands for me right now, UE5 is at best disappointing. Great new features, but the cost of said features is way too impractical. It is in need of some serious optimization.
 
Maybe, but performance has been cut to a third of its original value while failing to deliver visuals that are 3x better. So far, UE5’s pitch hasn’t been about the end user; it’s been about saving developers/publishers money. The end user is getting shafted in the process. They must upgrade their hardware to get marginally better visuals?
...

This is not true. These improvements are not just for developers. Nanite gives you a geometry pipeline that can handle counts of small triangles that would cripple a hardware rasterizer. You can have triangles that cover a single pixel. It also drastically reduces visible pop-in from LOD changes. Those are fidelity-increasing improvements that perform better than a hardware rasterizer, which loses more and more performance the fewer pixels the triangles cover. So that's a performance improvement and a fidelity improvement. It's slower than a HW rasterizer only if you reduce the geometry to a point where the HW doesn't slow down, which requires lower geometry detail and an LOD system that will lead to pop-in. Nanite is currently a one-of-a-kind software rasterizer in production; I'm not aware of any others.
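
For anyone wondering what "software rasterizer for tiny triangles" means in practice, here is a heavily simplified sketch of the idea; the structs, projection math and the 4-pixel threshold are all made up for illustration and are not Epic's actual implementation:

```cpp
#include <cmath>
#include <cstdio>

// Hardware rasterizers shade in 2x2 pixel quads, so triangles approaching one
// pixel in size waste most of that work; a compute/software rasterizer can win
// there. Larger triangles still go to the hardware path.
enum class RasterPath { Hardware, Software };

struct Cluster {
    float boundsRadiusWorld;  // bounding-sphere radius of a triangle cluster
    int   triangleCount;      // triangles in the cluster
    float distanceToCamera;   // world-space distance
};

// Hypothetical projection: rough screen-space area of the cluster in pixels.
float projectedPixels(const Cluster& c, float viewportHeight, float fovTanHalf) {
    float screenRadius = (c.boundsRadiusWorld / (c.distanceToCamera * fovTanHalf))
                         * (viewportHeight * 0.5f);
    return 3.14159f * screenRadius * screenRadius;
}

RasterPath choosePath(const Cluster& c, float viewportHeight, float fovTanHalf) {
    float pixelsPerTriangle = projectedPixels(c, viewportHeight, fovTanHalf)
                              / static_cast<float>(c.triangleCount);
    // If the average triangle covers only a few pixels, software raster wins.
    const float kThreshold = 4.0f;  // made-up cutoff for illustration
    return pixelsPerTriangle < kThreshold ? RasterPath::Software
                                          : RasterPath::Hardware;
}

int main() {
    Cluster nearRock { 0.5f, 128, 2.0f };   // close-up, large on screen
    Cluster farRock  { 0.5f, 128, 80.0f };  // distant, sub-pixel triangles
    const float viewportH = 1440.0f, tanHalfFov = std::tan(0.5f * 1.2f);

    auto name = [](RasterPath p) { return p == RasterPath::Hardware ? "HW" : "SW"; };
    std::printf("near cluster -> %s raster\n", name(choosePath(nearRock, viewportH, tanHalfFov)));
    std::printf("far  cluster -> %s raster\n", name(choosePath(farRock,  viewportH, tanHalfFov)));
    return 0;
}
```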

Lumen is a dynamic real-time GI system. There could be a better trade-off if a game has a static environment, but for games with day-to-night transitions or other dynamic elements it won't break down. It also works with Nanite, which I'm not sure other GI solutions do.

Virtual shadow maps are the same deal. They work with Nanite, and they provide a universal shadowing system of good quality. They also handle dynamic environments. My experience with them in Fortnite was that they massively reduced shadow pop-in. In Fortnite I always turned shadows off because the pop-in was horrible, but with virtual shadow maps it's way more stable and visually pleasing.
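
A rough sketch of the pages-on-demand idea behind virtual shadow maps; the resolution, page size and code structure here are illustrative, not UE5's actual implementation:

```cpp
#include <bitset>
#include <cstddef>
#include <cstdio>

// A huge virtual shadow map is split into fixed-size pages, and only the pages
// actually referenced by visible screen pixels get allocated and rendered,
// which is what keeps the cost manageable.
constexpr int kVirtualRes   = 16384;  // virtual shadow-map resolution (texels)
constexpr int kPageSize     = 128;    // texels per page side
constexpr int kPagesPerSide = kVirtualRes / kPageSize;

struct PageTable {
    std::bitset<kPagesPerSide * kPagesPerSide> needed;

    // Called per visible screen pixel after projecting it into light space.
    void markNeeded(int shadowTexelX, int shadowTexelY) {
        int px = shadowTexelX / kPageSize;
        int py = shadowTexelY / kPageSize;
        needed.set(py * kPagesPerSide + px);
    }

    std::size_t neededPageCount() const { return needed.count(); }
};

int main() {
    PageTable table;
    // Pretend a handful of screen pixels landed in these shadow-map texels.
    table.markNeeded(100, 200);
    table.markNeeded(110, 200);    // same page as the one above
    table.markNeeded(9000, 12000); // a distant part of the light's frustum

    std::printf("pages to render: %zu of %d\n",
                table.neededPageCount(), kPagesPerSide * kPagesPerSide);
    // Only the marked pages would be rasterized into physical memory; the rest
    // of the virtual map costs nothing, and unchanged pages can be cached
    // across frames, which is what keeps the shadows stable.
    return 0;
}
```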

Yeah, these things are expensive, but they increase fidelity and solve real technical problems. That's how graphics has always worked. The industry comes up with better ways to approximate physical environments, and they tend to cost more than the previous ones, but the hardware gets better too. There are always points where the software side gets ahead of the hardware side. I don't think that's actually happening here. If anything, long-standing problems are being solved. UE4 lasted almost ten years. UE5 will likely last 5-10 years. These technologies are built for the future as much as for the current gen, and they will only improve.
 
This is not true. These improvements are not just for developers. Nanite gives you a geometry pipeline that can handle counts of small triangles that would cripple a hardware rasterizer. You can have triangles that cover a single pixel. It also drastically reduces visible pop-in from LOD changes. Those are fidelity-increasing improvements that perform better than a hardware rasterizer, which loses more and more performance the fewer pixels the triangles cover. So that's a performance improvement and a fidelity improvement. It's slower than a HW rasterizer only if you reduce the geometry to a point where the HW doesn't slow down, which requires lower geometry detail and an LOD system that will lead to pop-in. Nanite is currently a one-of-a-kind software rasterizer in production; I'm not aware of any others.

Lumen is a dynamic real-time GI system. There could be a better trade-off if a game has a static environment, but for games with day-to-night transitions or other dynamic elements it won't break down. It also works with Nanite, which I'm not sure other GI solutions do.

Virtual shadow maps are the same deal. They work with Nanite, and they provide a universal shadowing system of good quality. They also handle dynamic environments. My experience with them in Fortnite was that they massively reduced shadow pop-in. In Fortnite I always turned shadows off because the pop-in was horrible, but with virtual shadow maps it's way more stable and visually pleasing.

Yeah, these things are expensive, but they increase fidelity and solve real technical problems. That's how graphics has always worked. The industry comes up with better ways to approximate physical environments, and they tend to cost more than the previous ones, but the hardware gets better too. There are always points where the software side gets ahead of the hardware side. I don't think that's actually happening here. If anything, long-standing problems are being solved. UE4 lasted almost ten years. UE5 will likely last 5-10 years. These technologies are built for the future as much as for the current gen, and they will only improve.
I'm very much aware of the benefits of these features but, outside of this forum, the average user doesn't know and, more importantly, doesn't care. It's unwise to take the keen eye for detail displayed on this forum and extrapolate it to the majority of people playing games. This is a niche of a niche, and the opinions expressed on here are not remotely representative of the general audience. We even have popular tech-tubers like Austin Evans telling millions of people that SSR is raytracing, smh.

Even in this niche, we argue and debate these features with strong resistance on either side. I mean, look at Nanite: it's great in theory. In practice, one would have to argue the relevance of such a pipeline for games. Movies, sure, but games? Hyperbole here, but where's the dollar-store version of Nanite? You know, the one that doesn't offer as much granularity but also costs significantly less? That is the problem with UE5's new features. They're great in theory, but they have absolutely no care in the world for performance.

I just watched the latest DF Direct where they talked about the interview with the Immortals of Aveum devs. The devs discussed having to swap out UE5's TSR for FSR 2 because it performed better and had fewer ghosting artifacts. They discussed writing their own shader precompiling process because the one in UE5 was not performant.

Again, if I have a 4090, a $1600+ USD GPU, so one mortgage payment, I should be able to get decent performance in a UE5 game. Software does not exist in a vacuum. The available hardware must be taken into account when introducing new features and optimizing said features. If those of us with 4090s are struggling to run the game, it might as well not exist. It's not like we're seeing a Crysis-level step up in visuals to justify this woeful performance. Even Fortnite with UE5 features runs like rubbish.
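
On the shader precompiling point, the general idea is just to pay compilation costs during loading instead of on first use. Here's a generic sketch with hypothetical types and a stand-in compile function, not an actual engine API:

```cpp
#include <cstdio>
#include <functional>
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical pipeline key: shader name plus feature permutation.
struct PipelineKey {
    std::string shader;       // e.g. "character_lit"
    std::string permutation;  // e.g. "skinned+fog"
    bool operator==(const PipelineKey& o) const {
        return shader == o.shader && permutation == o.permutation;
    }
};
struct PipelineKeyHash {
    std::size_t operator()(const PipelineKey& k) const {
        return std::hash<std::string>{}(k.shader + "|" + k.permutation);
    }
};
struct CompiledPipeline { int handle; };

// Stand-in for an expensive driver-side compile that would otherwise happen
// the first time an object using this pipeline appears on screen (a hitch).
CompiledPipeline compilePipeline(const PipelineKey& k) {
    std::printf("compiling %s [%s]\n", k.shader.c_str(), k.permutation.c_str());
    return CompiledPipeline{1};
}

int main() {
    // A list of pipeline permutations recorded from previous play sessions.
    std::vector<PipelineKey> recorded = {
        {"character_lit", "skinned"},
        {"character_lit", "skinned+fog"},
        {"terrain",       "default"},
    };

    // Precompile everything during the loading screen...
    std::unordered_map<PipelineKey, CompiledPipeline, PipelineKeyHash> cache;
    for (const auto& key : recorded) cache.emplace(key, compilePipeline(key));

    // ...so gameplay only ever does cheap cache lookups instead of compiles.
    bool hit = cache.find({"character_lit", "skinned"}) != cache.end();
    std::printf("runtime lookup: %s\n", hit ? "cache hit, no hitch" : "miss");
    return 0;
}
```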
 
Immortals of Aveum runs at 70 fps on a 4090, using 4K DLSS Performance. This is not acceptable.

Worse yet, the performance evaluation tool bugs out and assigns 25 points from the CPU budget to each level of AF. So 16X AF costs about 100 points, which is half the CPU budget on a powerful 7800X3D.

 
I wouldn't consider Immortals of Aveum an example of upscaling being used as a crutch. The game is clearly on the cutting edge of graphical features, and it does translate to the actual visuals on screen.
 
So far, UE5’s pitch hasn’t been about the end user; it’s been about saving developers/publishers money. The end user is getting shafted in the process. They must upgrade their hardware to get marginally better visuals?
Or it lets developers/publishers keep making products without ballooning the costs even further.
 
Immortals of Aveum runs at 70 fps on a 4090, using 4K DLSS Performance. This is not acceptable.

Worse yet, the performance evaluation tool bugs out and assigns 25 points from the CPU budget to each level of AF. So 16X AF costs about 100 points, which is half the CPU budget on a powerful 7800X3D.

I am curious whether these reports from WCCFTech are with or without the developer day-0 patch or the reviewer manual hotfix. That fix dramatically changes the game's frame-times in DF's experience.

As for 16x AF costing 100 points in their pre-release version of that point allocation system, I asked the devs in the interview; to say the least, that number is not correct. Apparently none of the numbers in that point allocation system were really finalised pre-launch. I have yet to look in the launch .exe to see if they updated the values.
 