Upscaling Technology Has Become A Crutch

This was a conversation here some years back, when pixel counting was in vogue. 'Native Res' in the end only referred to 'Opaque Geometry'.

I don't know that I fully get BitByte's argument, but I think it basically comes down to, "if upscaling didn't exist, this game would be so bad it wouldn't be released. Upscaling has dug the developers out of a hole, and it does the same for a lot of devs who would otherwise crash and burn - which is what it would take to bring game QA back up to yesteryear standards."

The problem is the economics of game development. The solution at present isn't to solve those economics but to render at lower and lower res and just upscale it to something stomachable.
Exactly. Thank you.
 
Look at these "next gen graphics" from UE5. Like seriously, there are PS4 games that look better.
What's that saying? "If my grandmother had wheels, she'd be a bike."

Let’s be mindful of what you're criticizing here.

A) Remnant 2 - indie studio with three price points for their game, $50-$80 (announced Dec 2022 - released July 2023 - studio size of 63 people)
B) Immortals of Aveum - the first release of a completely new studio (founded in 2018 - released Aug 2023 - studio size of 100 people)
C) Fort Solis - another indie studio's first major release (a team of 10-15 people completed development in MONTHS)

UE5's official release was in 2022!
Expecting them to provide next-gen visual experiences when they never claimed they would with UE5? It's unfair to hold them to that standard.

What all of these studios have accomplished with a fraction of the resources and time frame big-budget teams have is incredible. They are pumping out impressive graphics at a fraction of what other studios would have paid back in the UE4 days.

Feel free to talk shit about UE5 if The Coalition or Ninja Theory bomb their UE5 titles with middling graphics. But cherry-picking on these guys without taking note of how little budget and manpower they had to create them - I don't think that argument works in your favour.
 
Awesome strawman from you. OP never said that all devs are lazy and incompetent. He said that it has become a crutch, which it most definitely has given the latest examples. It doesn't mean that everyone has suddenly turned incompetent and that we will no longer see well-optimized games; it simply means that many developers will now use this as an easy way out instead of properly optimizing their games.

As for it being downright wasteful not to use it, no. It still isn't perfect and has issues that not everyone finds acceptable over native resolution. I'm not one of those people but native resolution should still be an option until upscaling surpasses it in every metric and effectively makes it obsolete.
It is not a strawman whatsoever. Saying that devs are using it as a crutch directly implies that they think developers are purposely doing a worse job and just letting reconstruction take care of the rest. Just because they don't state this more directly doesn't mean that isn't exactly what they're saying in the end.

And no, it is basically impossible to prove that any developer is using it as a crutch, even if it's not impossible that a few might. A game simply not being super well optimized is not 'proof' of this, as again, such games have always existed. The idea that 'many developers will now use this as an easy way out' is a completely baseless claim. No matter how good tech gets, certain developers will do better and others worse at utilizing that tech. Nothing has changed here. It's no more a crutch than a new generation of GPUs could be accused of being.

As for it 'not being perfect', almost nothing is. This is a poor argument against it. And it's especially strange, after so many years of 'framerate is king' claims, that we're now suddenly supposed to pretend a technique that grants potentially huge amounts of performance headroom isn't worth it because it might have some minor imperfections at times. If a setting for clouds gained you 40% performance for a very minor degradation in visuals, almost everybody would recommend making that change.

And nobody is forcing anybody to use reconstruction unless you're on console. You still have the option to use native, you'll just have to pay for it, meaning you might well need a GPU that's like $300-400+ more expensive to play at the same settings/framerates. I think this reality alone proves how useful and worthwhile reconstruction is. The existence of some games not being well optimized changes nothing about this reality. Again, such games are nothing new.
 
The premise of the thread is fairly straightforward: by their looks alone, those games should run better.
That's a fine premise on a game-by-game basis, but it really has nothing to do with upscaling. I imagine folks here really just aren't familiar with how expensive dynamic lighting is. Admittedly, outside Fortnite (where I think most will agree the visual upgrade is pretty obvious, because it was always dynamically lit), the games so far don't lean into the dynamic nature very much with time-of-day or similar systems. Thus people are comparing the look to games with baked lighting, which is obviously far cheaper.

Could these specific games be done with baked lighting? Possibly, with some sacrifices in both lighting and geometric detail. And of course longer development times and less content. But that's certainly not true of all games.

it basically comes down to, "if upscaling didn't exist, this game would be so bad it wouldn't be released."
Right but that argument is silly on the face of it. When Battlefield 4 was released, no GPU could run it at 120fps at 4K, or even 60fps at 1440p. The very notion that a GPU has to run it at some arbitrarily chosen set of settings, just because that's what you're used to from last-gen games that were targeted at GPUs with 10x less power, doesn't make sense. Not to belabor the point, but were half-res particles a crutch or an optimization? Were any of the myriad of clever approximations (which in the end is all that real-time graphics is) crutches or "optimizations"? It's fine to discuss individual choices in detail, but the global notion that a game has to run at X performance at Y resolution or else it's "poorly optimized" is silly. We've done reconstruction for ages - it has just gotten more clever recently.

Ultimately, these games generally *do* run fine on mid-to-high end machines... and I can say that confidently because (unlike I presume many in this thread) I've actually played them. They are more expensive per pixel than games of the past, but they also look better than past dynamically-lit games.

Now as I mentioned above, whether or not people care about dynamic lighting is a separate discussion, and there's no disputing that it is expensive. Similar argument for the higher geometric detail brought by Nanite. Both have some good moments but aren't necessarily pushed as hard overall as they could be in the games discussed here.

Anyways I think I've said my bit at this point. I'd just prefer to steer the chat to specific games and techniques rather than people just making grandiose complaints about arbitrary metrics.
 
I fully agree with the point that Remnant 2, Immortals of Aveum and Fort Solis don't look even remotely good enough, compared to any of the previous UE5 demos, to justify their ridiculously high GPU and CPU requirements; some of them don't even support upscaling properly.

The problem right now seems to be that UE5 is again becoming CPU limited, and that the Software Lumen solution is as expensive as hardware RT without giving the visual benefits of hardware RT solutions (like in Metro Exodus, Dying Light 2, Witcher 3). Epic and the developers need to optimize the CPU portion of the engine better, and Epic needs to step up its game with Hardware Lumen to provide faster performance and better-quality Lumen (for global illumination and reflections).

the settings of Immortals of Aveum scale poorly and even dropping them to the lowest doesn't claw back that much performance
A clear sign of CPU limited code.
 
A clear sign of CPU limited code.
In this case I think it's more that reviewers are still just not used to the performance depending much more heavily on the pixel count. There are settings that heavily affect performance: resolution and upscaling modes. That's because they affect the cost of most of the rendering pipeline now, as they should. In the past, things like shadow quality and geometric quality affected performance a lot because they were basically just separate resolution settings. In UE5 both shadow quality and geometry quality are directly tied to the primary rasterization/shading rates, not their own settings... again, as they should be.
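To put rough numbers on that (purely illustrative - the exact scale factors vary per upscaler and per title), here's how much of the native pixel work the internal resolution represents at a 4K output for typical quality/balanced/performance style modes:

```python
# Rough sketch (my numbers, not anything official): how the internal render
# resolution - and therefore most of the per-frame pixel work in a pipeline
# where shadow and geometry detail track the shading rate - scales with the
# upscaling mode at a 4K output.

OUTPUT_W, OUTPUT_H = 3840, 2160

# Typical linear scale factors for quality/balanced/performance style modes
# (DLSS-like conventions; exact values vary per upscaler and per game).
MODES = {"native": 1.0, "quality": 0.667, "balanced": 0.58, "performance": 0.5}

native_pixels = OUTPUT_W * OUTPUT_H
for mode, scale in MODES.items():
    w, h = int(OUTPUT_W * scale), int(OUTPUT_H * scale)
    share = (w * h) / native_pixels
    print(f"{mode:>11}: {w}x{h} internal -> {share:.0%} of native pixel work")
```

So a performance-style mode is roughly a 4x cut in shaded pixels, and when shadow and geometry detail are tied to the shading rate, that cut drags most of the frame cost down with it, which is why the resolution/upscaling setting dominates the benchmarks.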

There may of course be CPU limited parts too, but that's easy enough for reviewers to test now I think. In the reviews linked in this thread there was still heavy dependence on resolution/upscaling settings (performance, balanced, quality, etc) which leads me to believe it's still pretty GPU limited in the cases they are testing.
 
Right but that argument is silly on the face of it. When Battlefield 4 was released, no GPU could run it at 120fps at 4K, or even 60fps at 1440p. The very notion that a GPU has to run it at some arbitrarily chosen set of settings, just because that's what you're used to from last-gen games that were targeted at GPUs with 10x less power, doesn't make sense.
He is not arguing that; he is arguing that the visual upgrade is not worth the huge performance cost.

This is Immortals of Aveum at native 4K: 43fps for a 4090. That's laughable. What is this game doing that is different from Metro Exodus, Dying Light 2, Witcher 3, or Cyberpunk 2077 (regular RT modes)? All of those games run faster at native 4K on the 4090 while simultaneously rendering several ray-traced effects: reflections, shadows, global illumination, etc. Fort Solis also runs at 45fps on the 4090 at native 4K, so similar performance, while doing substantially less than the aforementioned games.

[Attached chart: performance at 3840x2160]



There may of course be CPU limited parts too, but that's easy enough for reviewers to test now I think
In the footage of the Immortals game, there were many, many scenes where the 4090 was at 50% utilization on a 7800X3D CPU!
 
He is not arguing that; he is arguing that the visual upgrade is not worth the huge performance cost.
Great, I don't even think I'd disagree with that argument. But then people go ahead and follow such statements up immediately with stuff like...

This is Immortals of Aveum at native 4K: 43fps for a 4090. That's laughable.
I think we're just spiraling at this point but no, the notion that "native 4k" is a meaningful thing to measure is what is laughable. This is precisely the sort of statement that is just becoming willfully ignorant at this point. Is there literally not even a single test of the default upscaling modes in that review? People need to go rewatch @Dictator's discussion I guess...

What is this game doing that is different from Metro Exodus, Dying Light 2, Witcher 3, or Cyberpunk 2077 (regular RT modes)?
If you ran any of those games with geometry and shadows that were 1:1 with the pixels at 4k I guarantee they would run like shit. But they don't even give you the option. The best you can get is ~2k shadow maps and 100+ pixel polygons. If you're lucky you get ray traced shadows but then you're guaranteed to have relatively low detail geometry because GPUs can't currently handle detailed BVHs, especially for anything dynamic. The only thing you get at "4k" is the textures. It's just not a meaningful comparison.
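Just to put the texel math in perspective, a back-of-the-envelope comparison (not a real cost model - actual implementations use cascades, caching and culling):

```python
# Back-of-the-envelope texel math (illustrative only; real renderers use
# cascades, caching and culling, so this is not a literal cost model).

screen_pixels = 3840 * 2160        # ~8.3M shaded pixels at 4K
legacy_shadow_map = 2048 * 2048    # ~4.2M texels in a typical "2k" shadow map

# A virtualized shadow setup that targets roughly one shadow texel per screen
# pixel has to resolve on the order of the full screen's pixel count.
print(f"4K screen pixels     : {screen_pixels:,}")
print(f"2k shadow map texels : {legacy_shadow_map:,}")
print(f"1:1 target vs 2k map : {screen_pixels / legacy_shadow_map:.1f}x more texels")
```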

Now it's of course totally fine to compare the subjective quality of the graphics in these games relative to the performance, even against games that use partially or fully baked lighting with appropriate caveats. But let's stop with the silly soapbox nonsense, especially on Beyond3D. There's a reason why I come here instead of other forums...

In the footage of the Immortals game, there were many, many scenes where the 4090 was at 50% utilization on a 7800X3D CPU!
Didn't see the link in those reviews, but was this 50% utilization while the game was at like 40fps or something, or was the game still running quickly and it was just a relatively easy GPU frame (lots of sky, etc.)? I'm running on a weaker CPU than that and I don't really get drops below 80fps or so, which is completely fine for this game. If it's CPU limited at a decent frame rate, that just means you can increase the sampling rate to get closer to 4k NATIVE, right? :p
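For what it's worth, here's roughly how I'd sanity-check that kind of claim from a frame-time capture - a hypothetical helper, assuming you can export per-frame total time and GPU busy time from something like PresentMon, with an arbitrary threshold:

```python
# Hypothetical helper: classify captured frames as CPU-bound vs GPU-bound from
# total frame time and GPU busy time. The 0.85 slack threshold is an arbitrary
# illustrative choice, not a standard value.

def classify_frame(frame_ms: float, gpu_busy_ms: float, slack: float = 0.85) -> str:
    """If the GPU sat idle for a big chunk of the frame, something else
    (CPU, streaming, sync) was the limiter for that frame."""
    return "CPU/other-bound" if gpu_busy_ms < frame_ms * slack else "GPU-bound"

# 50% GPU utilization only matters if the frame rate is also low:
samples = [
    (25.0, 12.5),  # 40 fps with the GPU half idle -> genuinely CPU-limited
    (8.0, 4.0),    # 125 fps "easy" GPU frame (lots of sky) -> also flagged,
                   # but at a frame rate nobody would complain about
]
for frame_ms, gpu_ms in samples:
    print(f"{1000 / frame_ms:5.0f} fps, GPU busy {gpu_ms / frame_ms:.0%} -> "
          f"{classify_frame(frame_ms, gpu_ms)}")
```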
 
Not to discredit the work these developers are doing.. but I think we need to hold off until we see some examples from studios we know and trust. None of these games are from larger studios.. and so it's expected that things might not be as optimal as they could be. I mean yes, it's becoming quite clear that fully utilizing all the advanced features of UE5 is going to be very taxing, especially as you scale up resolution, and as we know.. CPU will be the limiting factor, and we may not see the scalability we want for a while yet.. but people really need to stop looking at Unreal Engine 5 as something that will only exist for the tech of today... No.. this is quite clearly a massive bet on the future.. one designed around allowing developers to really go all out as well as streamline a lot of their workflows.

There will be developers whose art style or visuals may not see a massive benefit, and it may not be in their best interest to use the features of UE5... but for sure there will be developers out there who will produce things which simply wouldn't be possible at the same fidelity and stability using traditional methods.

For example, I'm looking forward to seeing what The Coalition can do with UE5.. and I think we'll have a bit better of an idea once these bigger studios show what can really be done with it. And another thing.. there will be, as always, a group of people who "can't see any difference" and think "this isn't worth the cost"... Look at when Nvidia unveiled their hardware RT acceleration... So many people claimed it was just a gimmick... and a lot of those people are now hyping up the RT modes because they understand what it does and can do.

The same thing will happen here. It will simply take some time.. and as more examples of the tech being well utilized come about, people's tunes will start to change.
 
I think we're just spiraling at this point but no, the notion that "native 4k" is a meaningful thing to measure is what is laughable.
I totally agree with you on this: native 4K is meaningless, especially in the presence of competent upscalers. However, native sets the baseline. Native 1440p is relevant because it gives you an idea of the performance of 1440p-to-4K upscaling; native 1080p gives you an approximation of 1080p-to-4K upscaling. Native 4K will give us an idea about 4K-to-8K upscaling (on future hardware, of course). You can NEVER and should NEVER discard performance at native resolution.
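That's also why a native number lets you ballpark the upscaled one. A crude model, assuming the upscale pass itself has a roughly fixed cost (the numbers here are made up for illustration):

```python
# Crude model: frame time with upscaling ~= frame time at the internal (native)
# resolution + a roughly fixed upscaler pass cost. Numbers are invented for
# illustration; real scaling is never this clean.

def estimated_upscaled_fps(fps_at_internal_res: float, upscaler_cost_ms: float = 1.2) -> float:
    base_ms = 1000.0 / fps_at_internal_res
    return 1000.0 / (base_ms + upscaler_cost_ms)

# If a card manages 60 fps at native 1440p, a 4K "quality" mode rendering
# internally at ~1440p should land somewhere in this ballpark:
print(f"{estimated_upscaled_fps(60.0):.0f} fps")  # ~56 fps with a ~1.2 ms upscale pass
```

Which is exactly why the native numbers are still worth publishing, even if nobody actually plays at them.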

If you ran any of those games with geometry and shadows that were 1:1 with the pixels at 4k I guarantee they would run like shit
I get that: UE5 is designed to scale geometry and shadows upward with resolution, and as such native 4K becomes unattainable on current hardware. However, isn't that an inefficient use of resources? These games (Fort Solis and Immortals of Aveum) are very narrow in scope; we are comparing them to fully open-world games with vastly bigger worlds, better reflections and global illumination. The use of 1:1 shadows to pixels while not doing any better job at rendering shadows than the previously mentioned games is certainly not efficient. We all understand that these are different games with different tradeoffs, but some manage to arrive at a better overall look relative to performance.
 
The use of 1:1 shadows to pixels while not doing any better job at rendering shadows than the previously mentioned games is certainly not efficient.
I'd have to agree to disagree on the shadows front (perhaps predictably :p). Even on default shadow settings the quality is better than most other shadow map games. Certainly if you are running 4k native with equivalent shadow resolution they definitely look better.

Triangle-raytraced shadows can definitely look better in various ways, but of course again with the caveat that they only work in games with relatively less geometric detail (the current Achilles' heel of raytracing, unfortunately) and/or with proxy geometry. Even games with raytraced shadows typically need to augment them with screen-space shadows, both for finer detail and - more seriously - to cover distant geometry.

We all understand that these are different games with different tradeoffs, but some manage to arrive at a better overall look relative to performance.
Sure, and I don't necessarily disagree with that part overall either, even being somewhat subjective and specific to these games. But the hyperbole around things being "unacceptable" and "unplayable" and "omg 4k native" undermines that point and the entire conversation rather than making it.
 
None of these games are from larger studios.. and so it's expected that things might not be as optimal as they could be
As I said before, the problem is that these new UE5 games don't behave that differently from running Fortnite at native 4K; here is Fortnite delivering 45fps in busy scenes on a 4090 at max settings and native 4K. Fortnite is, of course, the best-optimized UE5 title we have right now.


Here is the newly released STALKER 2 gameplay trailer; most busy scenes run at low fps in the trailer, and you can spot them easily. As such, I expect STALKER 2 to be within that very same performance envelope.


However, don't get me wrong on this: I am encouraging everyone to expect that this is the new baseline performance of UE5 when running all of its features, and I am not criticizing that in any shape or form, by the way. People should brace themselves for that fact and adapt to it. UE5 relies on upscaling to deliver good performance. The best thing people can do is ask for access to the best-quality upscalers. I now certainly want and demand that DLSS2 and DLSS3 (even DLSS3.5) be present in every UE5 title. I don't want to deal with the suboptimal FSR upscaler, or the much inferior and generic TSR upscaler. I seek the maximum image quality possible in UE5 titles.
 
You can't use the 4090, or any nVidia card, as the benchmark here. UE5 isn't optimized for nVidia. The 4090 is only 55% faster with Software Lumen than a 6950XT.

The problem with UE5 is that developers must still do the same work so that Lumen can work properly. Here is a scene from Immortals of Aveum in which Lumen is presumably still being calculated(?) but is effectively non-existent:


In comparison, this is a scene from a Portal mod dropped into RTX Remix with assets from nVidia:


UE5 is in a strange position: too slow to replace older engines (with HW RT), and not able to achieve the image quality of optimized path tracing. When I have to use upscaling on my 4090 because the engine isn't optimized for nVidia and the techniques aren't fast in general, why should I accept only mediocre results?! I get 80+ FPS in Portal RTX with DLSS Quality and ~100 FPS in Immortals of Aveum.

And games like Immortals of Aveum do not help. Immortals of Aveum is a PR move by EA. There isn't any love here: drop the assets into UE 5.1, activate Lumen, and call it a day. The worst offender is this (maybe the game just doesn't render properly on my 4090):


Control:
 
I tend to adjust my game settings until I can get 120 fps, and I'll even play games on all-low settings, despite having a 3080, if I can get close to 240Hz and the game is a competitive multiplayer type. All that said, I've done enough reading about computer graphics to understand where the brick walls are: hardware-imposed geometry limitations (hardware rasterization), memory latency and bandwidth scaling slower than compute, the cost of chip fabrication increasing rapidly, the cost of game development increasing rapidly, BVH building/refitting being very expensive, multithreading still being a very difficult problem, etc.

So what do you do about all of that? You increase fidelity in smarter ways, and one of those is reusing samples over time. Instead of rendering a frame and then throwing away all of the data you generated (good data), you reuse it (as much as possible). It's inevitable that this will become the de facto standard for how real-time graphics are rendered. To be honest, it already is. It started with TAA, because MSAA was no longer compatible with state-of-the-art real-time rendering. Then people figured out you could include an upscale and spend more of your compute generating samples at a lower resolution. Rendering is switching from high resolutions with an incredibly low sample rate (under-sampled) to lower resolutions with a higher sample rate (... still nowhere close to the sample rate of reference renders).

That's basically what UE5 is doing. You generate more samples per pixel - in the case of shadows and lighting it's 1:1 with pixels - and then you have samples from previous frames to help upscale. It's probably not an accident that I can play Remnant 2 at DLSS Performance at 1440p and it looks reasonable enough, whereas in pretty much any other game even DLSS Balanced is a stretch.
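For anyone who hasn't looked at how sample reuse actually works, here's a stripped-down sketch of temporal accumulation - no motion vectors, history rejection or upscaling, so nothing like a production TAA/DLSS resolver, just the core idea of amortizing samples across frames:

```python
import numpy as np

def temporal_accumulate(history: np.ndarray, current: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """Exponential moving average of frames: each new noisy/jittered frame only
    contributes a fraction, so samples from many previous frames stay in the
    image. A production resolver adds reprojection with motion vectors, history
    rejection/clamping, and an upscale from a lower internal resolution."""
    return (1.0 - alpha) * history + alpha * current

# Toy usage: accumulating noisy renders of a static scene converges toward the
# clean signal, which is the whole point of reusing samples over time.
rng = np.random.default_rng(0)
truth = np.full((4, 4), 0.5)                         # the "correct" image
history = truth + rng.normal(0.0, 0.2, truth.shape)  # first noisy frame
for _ in range(60):
    history = temporal_accumulate(history, truth + rng.normal(0.0, 0.2, truth.shape))
print(float(np.abs(history - truth).mean()))  # far below the 0.2 per-frame noise
```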

I hesitate to spend a lot of time generating comparisons and benchmarks because it's honestly a lot of work, I wouldn't be getting paid for it, and I'd only be doing it to satisfy people that just can't accept the reality of where real-time graphics are headed ... but I might have to. I could spend a bunch of effort in Fortnite just with Nanite and virtual shadow maps alone, without getting into the complexity of Lumen.
 
You can't use the 4090, or any nVidia card, as the benchmark here. UE5 isn't optimized for nVidia. The 4090 is only 55% faster with Software Lumen than a 6950XT.
I don't agree with this statement. Only Immortals of Aveum shows this; it's an exception.

For example, in Fort Solis, a 4090 is 2x the performance of a 6900XT.

In Fortnite UE5, the 4090 is 2.5x faster than the 6950XT.
 
@DavidGraham That Fortnite video is so good. When he jumps from the bus you can see all of the trees rendered to the edges of the map, including the shadows. There's some distance where they switch to SDF impostors, I think, but the transitions are handled incredibly well. You can spot the occasional place where there's some kind of pop-in, usually near the edges of the screen, but I'm not sure what that is. Overall, the whole thing is a nice demonstration of Nanite geometry and VSMs. You have incredibly stable geometry and shadows from high up in the air down to the ground. Those trees have around 30K leaves, each made of three or four polygons, not including the trunk. You can see it all from high up and it scales beautifully the closer you get. Shadows are very high quality up close and far away, and all of it responds as things are built and destroyed. It's super impressive.
 
The same thing will happen here. It will simply take some time.. and as more examples of the tech being well utilized come about, people's tunes will start to change.
nVidia claims that Alan Wake 2 could run at 30 FPS in native 4K on a 4090. That's only two months from release, and being able to play a game with path tracing at ~30 FPS would be a huge achievement.
So I don't really think that Epic and these game developers have much more time.
 
I hesitate to spend a lot of time generating comparisons and benchmarks because it's honestly a lot of work, I wouldn't be getting paid for it, and I'd only be doing it to satisfy people that just can't accept the reality of where real-time graphics are headed ... but I might have to. I could spend a bunch of effort in Fortnite just with Nanite and virtual shadow maps alone, without getting into the complexity of Lumen.
Yeah I hear you on this one. I gathered a few in Remnant and IoA and such with the original intention of posting a few examples here but I realized that it's just going to turn into a cherry-picking fest to feed the narratives. I can find terrible looking shots in any of the games people mention. Hell I can find terrible looking photographs :p Conversely I can find really nice looking shots in most games, and definitely in the new ones. But ultimately I think the bias in these conversations is always going to swing towards people just picking whatever bad looking screenshots they can find and I don't really want to contribute to that internet cynical race to the bottom.

IIRC we used to have a thread here where people posted nice shots they had gathered from games, but I can't seem to find it now...
 
nVidia claims that Alan Wake 2 could run at 30 FPS in native 4K on a 4090. That's only two months from release, and being able to play a game with path tracing at ~30 FPS would be a huge achievement.
So I don't really think that Epic and these game developers have much more time.

I love Remedy games. I hope I can get some reasonable settings on my 3080 or I'm going to 😭
 