Upscaling Technology Has Become A Crutch

BitByte

From the inception of DLSS, FSR, and XeSS, I’ve been adamant that they’re nothing but crutches. Theoretically speaking, these technologies are supposed to let us render graphics more intelligently and better utilize the GPU. In actuality, they just encourage an extreme lack of optimization and the proliferation of technical incompetence. There are a host of games guilty of this, but the latest offender is Remnant 2: a game so average looking, yet somehow the 4090 cannot render it at 4K60 without upscaling. At what point is this going to stop? On consoles, it can barely hold 60fps at 720p-ish resolutions. This trend cannot continue as we cannot keep upgrading to cope with these levels of incompetence. I know I’ll catch some flak for calling it incompetence, but it is what it is. There are still some studios with a sense of pride and craftsmanship in their work, but they feel like relics of yesteryear.

Honestly, I don’t know how a studio can release work like this and be proud. It’s just awful.
 
This trend cannot continue as we cannot keep upgrading to cope with these levels of incompetence.
Some people still refuse to acknowledge this, but the gaming scene is full of this level of incompetence now. Don't take my word for it, though; let's hear it from an actual veteran game developer.

"I believe there's a systems programming crisis in game development where most of the people writing code have no idea what the hardware it's running on is like, how it operates, or the operating system / kernel fundamentals at a systems level"

 
How many UE5 developers are actually writing low-level graphics code? I don't believe there are all that many, and especially not on the indie side.

Besides, 4090 cards aren't exactly maxing out Fortnite at 4K60 without upscaling either. We already knew that UE5 was GPU intensive when using its more advanced features.
 
From the inception of DLSS, FSR, and XeSS, I’ve been adamant that they’re nothing but crutches.
From the inception, that sort of claim has always been absurd.

Advanced reconstruction techniques like these are some of the best things to happen to gaming tech in a while. It's a *nearly* free giant boost to performance/overhead. It makes playing at 4K far more feasible than having to run everything natively. And hell, DLSS itself often does a better job of anti-aliasing than native 4K w/TAA, which is very welcome as fine detail in games keeps going up.
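For a sense of scale, here's a quick back-of-envelope sketch of what those presets actually render internally for a 4K output. The per-axis scale factors are the commonly cited DLSS presets; that's my assumption for illustration, not something quoted in this thread.

```python
# Rough pixel-count math for reconstruction at a 4K output.
# The per-axis scale factors below are the commonly cited DLSS presets --
# an assumption for illustration, not figures taken from this thread.
OUTPUT_W, OUTPUT_H = 3840, 2160

PRESETS = {
    "Quality": 2 / 3,       # ~2560x1440 internal
    "Balanced": 0.58,       # ~2227x1253 internal
    "Performance": 0.5,     # ~1920x1080 internal
}

output_pixels = OUTPUT_W * OUTPUT_H

for name, scale in PRESETS.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    print(f"{name:11s} {w}x{h}  shades "
          f"{w * h / output_pixels:.0%} of native 4K's pixels")
```

The shading work drops off much faster than the perceived image quality does, which is why the "nearly free" framing isn't far off.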

And this idea that developers would all just stop optimizing their games and rely on this instead is the EXACT SAME rhetoric we see nearly every time a new generation of GPUs comes out and some myopic people claim that developers will stop optimizing their games and just tell everybody to buy new GPUs (with many even going as far as saying it's a big conspiracy and developers are being paid to do this!).

In the end, it's all basically just thinly disguised 'lazy developer' rhetoric, which I can't stand given how immensely hard developers work to make these games as good as they possibly can for us. 'Crunch' wouldn't be such a widely debated topic in the games industry if developers weren't working as hard as they can. And games wouldn't take longer to make than ever before if developers were all just worried about rushing out games as fast as possible with no care put into them.

I've maintained from pretty early on that reconstruction would become standardized in games. It's almost a waste NOT to use it when you're aiming for a 4K presentation. We can already see this happening on consoles, where reconstruction is being used by default in many demanding games, and it was inevitable that developers would expect users on PC to similarly use it by default. I even saw a poll somewhere that more than 50% of RTX owners use DLSS when it's available in their games. Not scientific evidence of anything, but a good indication that PC users do think reconstruction (which will only keep getting better) is very much worthwhile.

Obviously it's possible there will be examples of games without good optimization that lean on reconstruction, but games without good optimization have always existed. Again, we've heard the same arguments many times in the past that developers would recommend using a newer/more powerful GPU in order to make up for poor optimization. It's the SAME rhetoric all over again. And it's never really been true as a whole. Some isolated incidents don't prove it encourages poor optimization any more than Nvidia releasing some new GPUs does.
 
Isn't that what crutches are? Crutches were invented to enable people to move around.
OP is basically arguing that reconstruction techniques are crutches helping people who just don't want to learn to walk normally get around, though, not people with actual disabilities.

Cuz yea, in real life, crutches are a good thing that some people actually need.
 
I would argue that upscaling became a necessity a long, long time ago. Imagine the PS4 library without checkerboarding, or the Xbox One without dynamic resolution. Think of how much worse the games would look. We'd mostly have been playing games that looked like higher resolution PS360 games. Without upscaling, ray tracing would not be viable at all, nor would UE5's Lumen. We're probably a decade past brute-forcing better graphics purely with better hardware. Enjoy buying a $2K graphics card so you can play at marginally better (or worse, depending on who you ask) "native" rendering. The idea of designing a game around upscaling describes pretty much every single big budget console game, even the ones people praise for their graphics.
 
Decided to take a quick look at Fortnite. I play Fortnite with everything on low (including textures) except view distance epic, Nanite geometry on, and DLSS Quality at 1440p. I have an RTX 3080 10GB. I landed in a quiet spot and was getting 170 fps at that particular view. Turned on virtual shadow maps (medium is the lowest setting) and my fps dropped to 120. Changed virtual shadow maps to ultra and my fps dropped to 90. Then I switched to TAA with 100% 3D resolution and my fps dropped to 65. And that's with textures, effects, post processing, etc. all turned to low. That was landing in a quiet area on a small hill overlooking some trees.
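To make those numbers easier to compare, here's the same data converted to frame times. The fps figures come straight from the paragraph above; attributing each delta to a single setting change is my assumption.

```python
# Convert the reported Fortnite fps figures into frame times (ms) so each
# step's cost is easier to compare. The fps values are taken straight from
# the post above; attributing each delta to a single setting change assumes
# nothing else varied between measurements.
steps = [
    ("Low + Nanite, DLSS Quality (baseline)", 170),
    ("+ virtual shadow maps (medium)", 120),
    ("VSM medium -> ultra", 90),
    ("DLSS Quality -> native 1440p TAA", 65),
]

prev_ms = None
for label, fps in steps:
    ms = 1000.0 / fps
    delta = "" if prev_ms is None else f"  (+{ms - prev_ms:.1f} ms)"
    print(f"{label:40s} {fps:3d} fps  {ms:4.1f} ms{delta}")
    prev_ms = ms
```

Read that way, dropping DLSS Quality for native costs nearly as much frame time as both virtual shadow map bumps combined.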

So is Remnant 2 super unoptimized when using similar technology?

Edit: I really think that unless you can actually see the problems they solve, Nanite and virtual shadow maps are just pretty expensive.

Here's a vid of some guy who is absolutely blind comparing it to Killzone 3 ... he even includes a Killzone 3 screen, like it doesn't immediately show how massively better Remnant 2 looks LOL

 
In actuality, they just encourage an extreme lack of optimization and the proliferation of technical incompetence. There are a host of games guilty of this, but the latest offender is Remnant 2: a game so average looking, yet somehow the 4090 cannot render it at 4K60 without upscaling.
Your argument is a non sequitur. It's not upscaling that's the problem you're calling out. Assuming it's true that graphics these days are substandard relative to what's possible, the upscaling isn't causing it. Upscaling would be enabling even better than the present. So take your example of Remnant 2, which you consider average looking but which a 4090 can't render at 4K60 without upscaling: remove the upscaling and now you have an average looking game running at a lower res and framerate! Now let's say we remove the upscaling and the devs get off their fat, lazy behinds, actually do some work, and write good code that gets an average looking game to 4K60 maxing out the 4090. If you then want it to look better than average, you need more juice, and that can only come from freeing up what's currently being used, which means rendering at a lower res with more pretties enabled, plus upscaling.

The problem here is the engines and workloads. Identifying the limitations would need a good look at hardware utilisation and budgets, and building a solid model of development costs and possibilities. Blaming a single tick-box feature doesn't do that; one could just as readily blame SSAO or deferred rendering in the kind of blanket finger-pointing exercise you've done.

Putting on my mod hat, I'm not sure I should keep this thread open. It's not a useful discussion, just a wide-angle bitch. An actual discussion would have to look at modern gamedev and determine what, if anything, is actually holding it back. I don't think anyone can do that without being inside the development of multiple AAA games, and an armchair "games don't look as good as I'd like" perspective has nothing to contribute other than feelings.
 
Your argument is a non sequitur. It's not upscaling that's the problem you're calling out. Assuming it's true that graphics these days are substandard relative to what's possible, the upscaling isn't causing it. Upscaling would be enabling even better than the present. So take your example of Remnant 2, which you consider average looking but which a 4090 can't render at 4K60 without upscaling: remove the upscaling and now you have an average looking game running at a lower res and framerate! Now let's say we remove the upscaling and the devs get off their fat, lazy behinds, actually do some work, and write good code that gets an average looking game to 4K60 maxing out the 4090. If you then want it to look better than average, you need more juice, and that can only come from freeing up what's currently being used, which means rendering at a lower res with more pretties enabled, plus upscaling.

The problem here is the engines and workloads. Identifying the limitations would need a good look at hardware utilisation and budgets, and building a solid model of development costs and possibilities. Blaming a single tick-box feature doesn't do that; one could just as readily blame SSAO or deferred rendering in the kind of blanket finger-pointing exercise you've done.

Putting on my mod hat, I'm not sure I should keep this thread open. It's not a useful discussion, just a wide-angle bitch. An actual discussion would have to look at modern gamedev and determine what, if anything, is actually holding it back. I don't think anyone can do that without being inside the development of multiple AAA games, and an armchair "games don't look as good as I'd like" perspective has nothing to contribute other than feelings.
It’s a bad faith argument, imo. There’s no limit to how far back we can go. All AA should have been SSAA, all textures and assets should never be compressed, and so on, all for maximum fidelity.

There’s really no limit here imo and it comes across as some awkward form of elitism.
 
Upscaling being a crutch is a good thing. It's such a good crutch that it lets all the other crutches that make up rasterization perform a bit better. I wouldn't complain about this easy win.

Game development these days isn't easy in the AAA space. Inexperienced programmers new to game development may be the easiest to point a finger at, but the problem is likely rooted deeper in the dev environment as a whole. We could go on and on about what these issues are, but that would warrant a new thread. I tend to blame the management side of things more for the current state of games.
 
Your argument is a non sequitur. It's not upscaling that's the problem you're calling out. Assuming it's true that graphics these days are substandard relative to what's possible, the upscaling isn't causing it. Upscaling would be enabling even better than the present. So take your example of Remnant 2, which you consider average looking but which a 4090 can't render at 4K60 without upscaling: remove the upscaling and now you have an average looking game running at a lower res and framerate! Now let's say we remove the upscaling and the devs get off their fat, lazy behinds, actually do some work, and write good code that gets an average looking game to 4K60 maxing out the 4090. If you then want it to look better than average, you need more juice, and that can only come from freeing up what's currently being used, which means rendering at a lower res with more pretties enabled, plus upscaling.

The problem here is the engines and workloads. Identifying the limitations would need a good look at hardware utilisation and budgets, and building a solid model of development costs and possibilities. Blaming a single tick-box feature doesn't do that; one could just as readily blame SSAO or deferred rendering in the kind of blanket finger-pointing exercise you've done.

Putting on my mod hat, I'm not sure I should keep this thread open. It's not a useful discussion, just a wide-angle bitch. An actual discussion would have to look at modern gamedev and determine what, if anything, is actually holding it back. I don't think anyone can do that without being inside the development of multiple AAA games, and an armchair "games don't look as good as I'd like" perspective has nothing to contribute other than feelings.
The devs of Remnant 2 literally said the game was designed to be used with upscaling. Their words, not mine. Also, the argument isn’t a non sequitur, because if upscaling didn’t exist, they wouldn’t ship the game in this state, period. Why? Because it would be completely panned.

Blaming the engine is a complete red herring because the devs are responsible for picking the engine and the features of the engine they use. If an engine has massive bottlenecks, then it’s the devs’ job to identify the proper engine for the job prior to ramping up production. This is not a UE5 problem; it spans multiple engines. In the past, when these techniques were not available, devs were forced to be more conscious of their performance and take the appropriate action. Now we have a lack of craftsmanship, a lack of technical competence, and the works. Let’s just make a game and hope FSR2 or DLSS or other reconstruction techniques save us from our poor decision making and lack of optimization. I’m simply pointing out what many have noticed…

DICE delivered BF3 on the PS360, and other studios delivered “impossible” experiences on very weak hardware. We’re nearly two decades removed from the launch of those systems, with orders of magnitude more performance in cutting-edge hardware. Yet somehow the devs of Remnant 2 want to tell me this average looking game can’t run at 4K60 on a $2,000+ 4090? Or that Jedi Survivor can’t run at a stable frame rate because they fixed absolutely none of the CPU issues from the last game? The list of woefully performing games continues to grow while delivering marginal improvements in graphics. I mean, the annoyance is not unfounded.
 
@BitByte This is a "AA" $50 game from a small studio. This isn't the same as Respawn laying an egg by releasing a Star Wars game with obvious issues. Now, that doesn't excuse severe problems. If you're selling a product, it had better be of a quality acceptable for its price. But there are differences in the lengths a smaller-budget studio can go to in altering an engine, or creating a bespoke one for its needs.

Can you actually prove that this isn't baseline UE5 performance? I tested out Fortnite on an RTX 3080 10GB with 100% 3D resolution w/TAA and all settings low except Nanite on, draw distance epic, and virtual shadow maps epic, and I was getting about 65 fps in a very simple area on a hill overlooking some trees. That was with effects low, post processing low, textures low. I'd guess I'd probably be in the 40s if I had everything maxed. Does that seem vastly out of line with Remnant 2 at ultra settings at 1440p? I could go back and check; I remember the area. Do you actually know that Remnant 2 is poorly optimized, or are you making assumptions about UE5's Nanite and virtual shadow maps? I'd be really interested in seeing some substantiation that there are a lot of easy performance wins to be had.
 
@BitByte This is a "AA" $50 game from a small studio. This isn't the same as Respawn laying an egg by releasing a Star Wars game with obvious issues. Now, that doesn't excuse severe problems. If you're selling a product, it had better be of a quality acceptable for its price. But there are differences in the lengths a smaller-budget studio can go to in altering an engine, or creating a bespoke one for its needs.

Can you actually prove that this isn't baseline UE5 performance? I tested out Fortnite on an RTX 3080 10GB with 100% 3D resolution w/TAA and all settings low except Nanite on, draw distance epic, and virtual shadow maps epic, and I was getting about 65 fps in a very simple area on a hill overlooking some trees. That was with effects low, post processing low, textures low. I'd guess I'd probably be in the 40s if I had everything maxed. Does that seem vastly out of line with Remnant 2 at ultra settings at 1440p? I could go back and check; I remember the area. Do you actually know that Remnant 2 is poorly optimized, or are you making assumptions about UE5's Nanite and virtual shadow maps? I'd be really interested in seeing some substantiation that there are a lot of easy performance wins to be had.
See, the problem is it doesn’t matter if it’s baseline UE5 performance. They chose to use Nanite and virtual shadow maps. Rewinding further back, they chose to use UE5 in the first place. That’s a choice they were free to make, but if I look at the game’s graphics and compare them to its performance, I consider it to be very poor. To me, that is unacceptable, and to top it off, I’m asked to use reconstruction techniques to play an average looking game at 1440p on a $2,000+ GPU. Switching gears to consoles for a moment, imagine if reconstruction weren’t available there. What ungodly resolutions would console players have to endure?

At the end of the day, it’s a game and you have to make smart trade-offs. If you’re going to deliver this level of performance on a 4090, the graphics must be revolutionary, and Remnant 2 is not that…
 
Aren't the issues specific to Remnant 2 more about the choice of Nanite and virtual shadow maps? If they hadn't picked those, it could run a lot faster. What has upscaling got to do with that choice? I guess you're saying the existence of upscaling enabled them to make that choice, because without it the framerate would be too low. Well, if they then gave up on Nanite, used different tech, and got a better framerate, they could add in reconstruction and make the results even better!

Upscaling isn't the deciding factor here. Probably it was a desire to use the latest tech for promotional purposes: the first non-Epic Nanite game, right? It's getting lots of coverage as a result.

It basically comes down to good games and not-so-good games. Other devs putting in more effort will out-compete on visuals and get the sales; that's how this business has always operated. If consumers choose to buy lower-effort creations, that's on them. If I were making a AAA game, I'd be intending to use upscaling. I'd crush the GPU, get every ounce of performance from it, and create the best looking game ever. I wouldn't waste performance getting an extra 10% pixel fidelity from rendering native.

Do you think Marbles RTX would look and run better without upscaling?
 
See, the problem is it doesn’t matter if it’s baseline UE5 performance. They chose to use Nanite and virtual shadow maps. Rewinding further back, they chose to use UE5 in the first place. That’s a choice they were free to make, but if I look at the game’s graphics and compare them to its performance, I consider it to be very poor. To me, that is unacceptable, and to top it off, I’m asked to use reconstruction techniques to play an average looking game at 1440p on a $2,000+ GPU. Switching gears to consoles for a moment, imagine if reconstruction weren’t available there. What ungodly resolutions would console players have to endure?

At the end of the day, it’s a game and you have to make smart trade-offs. If you’re going to deliver this level of performance on a 4090, the graphics must be revolutionary, and Remnant 2 is not that…
I’m curious: what would you classify as revolutionary graphics? Anyone can chime in too.
 
Remnant 2 is odd because it hugely benefits from upscaling. Just going from native 4K to 4K/DLSS Quality boosts performance by something like 60%, which I don’t think I’ve ever seen before; I generally see it in the range of 20-30%.
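A quick back-of-envelope check on that 60%, assuming DLSS Quality renders internally at roughly 2/3 of the output resolution per axis (a common figure, not something confirmed for Remnant 2 specifically):

```python
# Back-of-envelope: how much of a ~60% fps gain could come purely from
# shading fewer pixels? Assumes DLSS Quality renders internally at ~2/3 of
# the output resolution per axis (a common figure, not confirmed for
# Remnant 2 specifically).
native = 3840 * 2160      # ~8.29M pixels at native 4K
internal = 2560 * 1440    # ~3.69M pixels at DLSS Quality's internal res

print(f"Pixel-count ratio: {native / internal:.2f}x")  # 2.25x fewer pixels

# A 60% fps uplift means frame time fell to ~1/1.6 = 62.5% of native, so a
# large share of Remnant 2's frame appears to be per-pixel work, compared
# with games that only gain the usual 20-30% from the same preset.
```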
 