Upscaling Technology Has Become A Crutch

@T2098 - Great contribution. I didn't even think to check out Fort Solis. What a very mediocre looking game.

Here's Immortals of Aveum on Xbox Series X. I mean, apart from the horrible TSR, the game just looks like crap. You can't be upscaling games from PS360/PS Vita resolutions and think that reconstruction is going to save you. This game looks awful.


Look at this "next gen graphics" from UE5. Like seriously, there are PS4 games that look better.
[Screenshot: Immortals of Aveum.jpg]
 
@BitByte Do you think that shot is caused by engine limitations in UE5? It looks like a poor-quality texture that's scaled incorrectly. If you watch the vid, the textures on the ground around that spot don't look as pixelated. Overall, the game does look like a UE4 game: the rocks and grass don't look great at all, and the texture detail isn't amazing. It's hard to judge because, outside that one really bad spot, he doesn't stop to look at anything up close, at least not from skimming the video, but the textures don't look very sharp.

I also can't understand why they're using FSR instead of TSR. They said TSR caused ghosting, but the FSR implementation looks very bad. When the gun thingy on the arm is animating it leaves all kinds of ghosting and shimmering.

Edit: To each his own, but this guy keeps saying the gameplay is amazing, and all I see is a game I would never want to play. The console FOV alone is making my head hurt. They really can't do 100-105 horizontal on a console?

 
I'm not contradicting myself at all. DLSS was the first to kick it off and got the first-mover advantage; it all but consolidated Nvidia's current position. AMD had nothing and still has nothing. Intel showed up 4-5 years later, so obviously they're struggling to get traction. The key is that XeSS is just as promising as DLSS and works on all GPUs. The only reason Nvidia GPUs appear mandatory is simply that DLSS doesn't run on other GPUs. There's no technical reason for it; it's just about money.

Again, I don't understand your point. What do you mean by "NVIDIA's current position"? If you mean NVIDIA's current market domination, then I can't imagine that's due to DLSS. Most people probably didn't decide to buy an NVIDIA GPU instead of an AMD one simply because "DLSS is only available on NVIDIA GPUs."
If what you mean is the prevalence of DLSS, then again I don't think that's a big deal. As I mentioned before, upscaling mainly helps in scenarios where the game is limited by fill rate, so if you don't want to use it, you can easily turn down the resolution. If a game's performance is so bad that turning down the resolution doesn't help, then DLSS won't help either, so it's not the "fault" of upscaling tech. Upscaling tech is more like this: if a game only runs well at 1080p, you can use it to make it look good on a 4K display, and that's it.
If your point is that everything would be fine if NVIDIA made DLSS available to everyone, then all I can say is that of course that would be better for consumers, but I don't think NVIDIA "has" to do that, nor that it would be more ethical or something. It's business, and of course it's about money.
 
However, pan the camera down, and the contribution of the red light to the lighting of the floor grating disappears entirely, and it goes very dark, almost as if it's entirely in shadow, while the floor grating to the right of it does not.
Likely the red "light" isn't a light at all, and instead is just an emissive object. This will both tend to produce more noise, and also if the objects are too small they can't be captured very well in some of the Lumen data structures. This is less an issue of RT or not and more an issue of pushing the system too far on the emissive front. This is discussed in the UE documentation and several other sources online; there needs to be a good balance of lighting from analytic sources even if some (ideally larger) emissive objects can contribute as well.

https://docs.unrealengine.com/5.0/en-US/lumen-technical-details-in-unreal-engine/ (search for the various Emissive comments on there)
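
To make the noise point concrete, here's a toy sketch in Python (my own simplification, not Lumen code, and the numbers are made up): an analytic light can be evaluated directly every frame, while a small, very bright emissive only contributes when one of a handful of gather rays happens to hit it, so its estimate swings wildly from frame to frame.

```python
import random

# Toy final-gather at a single shading point (conceptual sketch, not Lumen code).
# An analytic light is evaluated directly; a small, bright emissive patch is only
# found when one of the few gather rays happens to hit it.

RAYS_PER_PIXEL = 8          # a handful of rays, as in a real-time gather
EMISSIVE_COVERAGE = 0.01    # fraction of the hemisphere the small emissive covers
EMISSIVE_RADIANCE = 50.0    # very bright, because it's being (mis)used as a light
ANALYTIC_IRRADIANCE = 0.5   # analytic light term, computed directly (no noise)

def gather_once() -> float:
    """One frame's lighting estimate at the shading point."""
    hits = sum(1 for _ in range(RAYS_PER_PIXEL) if random.random() < EMISSIVE_COVERAGE)
    emissive_estimate = (hits / RAYS_PER_PIXEL) * EMISSIVE_RADIANCE
    return ANALYTIC_IRRADIANCE + emissive_estimate

print([round(gather_once(), 2) for _ in range(10)])
# Typical output: mostly 0.5 (the emissive is missed entirely), with occasional
# spikes to 6.75+ when a ray does find it -- i.e. exactly the kind of splotchy,
# temporally unstable noise being shown in this thread.
```

Make the emissive larger (higher coverage) or swap it for a proper analytic light and the estimate settles down, which is basically what the docs are recommending.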
 
@T2098 - Great contribution. I didn't even think to check out Fort Solis. What a very mediocre looking game.

Here's Immortals of Aveum on Xbox Series X. I mean, apart from the horrible TSR, the game just looks like crap. You can't be upscaling games from PS360/PS Vita resolutions and think that reconstruction is going to save you. This game looks awful.


Look at this "next gen graphics" from UE5. Like seriously, there are PS4 games that look better.
[Screenshot: Immortals of Aveum.jpg]
This looks like a bug. Hard to imagine it could be the intended result.
 
Do we know what internal resolution the consoles are running at? I saw a suggestion from a YT video yesterday that it was "around 1800p most of the time" (FSR'd up to 4K) which I find laughably unrealistic.
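
For reference, the ratios are easy to sanity-check (quick back-of-the-envelope in Python, 16:9 resolutions assumed):

```python
# Quick sanity check on upscale ratios to a 3840x2160 output (16:9 assumed).
TARGET = (3840, 2160)

def ratios(w: int, h: int) -> tuple[float, float]:
    """(per-axis scale factor, total pixel ratio) from source res to 4K."""
    return TARGET[0] / w, (TARGET[0] * TARGET[1]) / (w * h)

for name, (w, h) in {
    "1800p": (3200, 1800),  # the claim from the video
    "1440p": (2560, 1440),  # FSR 2 "Quality" at a 4K output
    "1080p": (1920, 1080),  # FSR 2 "Performance" at a 4K output
    "720p":  (1280, 720),
}.items():
    per_axis, px = ratios(w, h)
    print(f"{name}: {per_axis:.2f}x per axis, {px:.2f}x the pixels")
# 1800p -> 4K is only ~1.4x the pixels, i.e. barely upscaling at all; that would
# be a strange place for a struggling UE5 title to land.
```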
 
@BitByte Do you think that shot is caused by engine limitations in UE5? It looks like a poor-quality texture that's scaled incorrectly. If you watch the vid, the textures on the ground around that spot don't look as pixelated. Overall, the game does look like a UE4 game: the rocks and grass don't look great at all, and the texture detail isn't amazing. It's hard to judge because, outside that one really bad spot, he doesn't stop to look at anything up close, at least not from skimming the video, but the textures don't look very sharp.

I also can't understand why they're using FSR instead of TSR. They said TSR caused ghosting, but the FSR implementation looks very bad. When the gun thingy on the arm is animating it leaves all kinds of ghosting and shimmering.

Edit: To each his own, but this guy keeps saying the gameplay is amazing, and all I see is a game I would never want to play. The console FOV alone is making my head hurt. They really can't do 100-105 horizontal on a console?

I don't think it's caused by the engine. I was just mocking the devs who said prior to release, "we want to push the graphics," in reference to why the spec requirements were so high.
 
Again, I don't understand your point. What do you mean by "NVIDIA's current position"? If you mean NVIDIA's current market domination, then I can't imagine that's due to DLSS. Most people probably didn't decide to buy an NVIDIA GPU instead of an AMD one simply because "DLSS is only available on NVIDIA GPUs."
If what you mean is the prevalence of DLSS, then again I don't think that's a big deal. As I mentioned before, upscaling mainly helps in scenarios where the game is limited by fill rate, so if you don't want to use it, you can easily turn down the resolution. If a game's performance is so bad that turning down the resolution doesn't help, then DLSS won't help either, so it's not the "fault" of upscaling tech. Upscaling tech is more like this: if a game only runs well at 1080p, you can use it to make it look good on a 4K display, and that's it.
If your point is that everything would be fine if NVIDIA made DLSS available to everyone, then all I can say is that of course that would be better for consumers, but I don't think NVIDIA "has" to do that, nor that it would be more ethical or something. It's business, and of course it's about money.
Yes, in reference to their market domination. It's a combination of DLSS and ray tracing. On the raster end, they hadn't significantly outperformed the competition prior to the 4090. The feature set and the support for it in games are what have accelerated their market dominance.

DLSS should run on all GPUs just like XeSS and FSR2. Other GPUs might not run it as well, but it's better for consumers, developers, etc. At this rate, it's just a matter of time until DLSS falls by the wayside: a viable alternative will arrive that runs on all systems. At this point in time, it appears to be XeSS, but it needs some work. However, once a definitive alternative arises, there's no reason to support DLSS anymore, because it's vendor specific. If the alternative is 90-95% as good, then the extra work required to support DLSS is a waste of time.

It looks to be around 720p.
I somehow doubt that especially on the Series S.
 
Yes, in reference to their market domination. It's a combination of DLSS and ray tracing. On the raster end, they hadn't significantly outperformed the competition prior to the 4090. The feature set and the support for it in games are what have accelerated their market dominance.

DLSS should run on all GPUs just like XeSS and FSR2. Other GPUs might not run it as well, but it's better for consumers, developers, etc. At this rate, it's just a matter of time until DLSS falls by the wayside: a viable alternative will arrive that runs on all systems. At this point in time, it appears to be XeSS, but it needs some work. However, once a definitive alternative arises, there's no reason to support DLSS anymore, because it's vendor specific. If the alternative is 90-95% as good, then the extra work required to support DLSS is a waste of time.

Well, I disagree that DLSS is the main factor in NVIDIA's current market dominance, but I guess we'll just have to agree to disagree.
On whether DLSS "should" run on all GPUs or not, again I think it's a moot point. If it runs badly on other hardware, there's no point in NVIDIA providing such support. Supporting other hardware costs money, and there's no reason to do that if it won't be used. Of course NVIDIA could open source it and let others port it over, but IMHO the main advantage of DLSS comes from the huge amount of training NVIDIA did (and is still doing), and I don't think it's fair to expect NVIDIA to give that away for free.

About the cost and complexity of a game supporting DLSS, my understanding is that if you decide to support one of them (e.g. XeSS), then you already have the data required by DLSS, so it's not really very hard. Again, I don't think that's a huge problem. NVIDIA did try to start an initiative to make a common framework for all upscaling techs; I believe Intel agreed to participate but AMD declined. If such a framework existed, I think it would help a lot with the complexity of supporting each upscaling tech.
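
Roughly what I mean, as a sketch (the names below are invented for illustration and aren't the real DLSS/FSR2/XeSS or Streamline APIs): all of these upscalers consume essentially the same per-frame inputs, so once a renderer produces them for one backend, adding another is mostly initialization plus a thin wrapper around that SDK's dispatch call.

```python
from dataclasses import dataclass
from typing import Any, Protocol

# Illustrative only -- the names below are invented for this sketch and do not
# correspond to any vendor's real SDK.

@dataclass
class UpscalerInputs:
    color_lowres: Any          # jittered, aliased low-res color target
    depth: Any
    motion_vectors: Any
    jitter_offset: tuple[float, float]
    exposure: float
    render_size: tuple[int, int]
    output_size: tuple[int, int]

class UpscalerBackend(Protocol):
    def evaluate(self, inputs: UpscalerInputs) -> Any: ...

def upscale_frame(backend: UpscalerBackend, inputs: UpscalerInputs) -> Any:
    # The per-vendor differences live behind `backend`; the renderer-side data
    # it has to provide is the same either way.
    return backend.evaluate(inputs)
```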
 
Likely the red "light" isn't a light at all, and instead is just an emissive object. This will both tend to produce more noise, and also if the objects are too small they can't be captured very well in some of the Lumen data structures. This is less an issue of RT or not and more an issue of pushing the system too far on the emissive front. This is discussed in the UE documentation and several other sources online; there needs to be a good balance of lighting from analytic sources even if some (ideally larger) emissive objects can contribute as well.

https://docs.unrealengine.com/5.0/en-US/lumen-technical-details-in-unreal-engine/ (search for the various Emissive comments on there)

Thank you for the explanation; that makes perfect sense. It was driving me nuts trying to figure out why some scenes (in hindsight, ones completely devoid of emissives) were lit smoothly and well, while others were just awash in noise even at the highest settings. It does look like the developer was just misusing emissives.

Another issue with this title is that there seems to be no consistency as to how the emissives work. Some of them don't appear to trace against the SDF representation at all, like this one which appears to be screenspace only:


There are also other lighting bugs that don't seem related to emissives, like this one where I'm standing in the exact same spot and my character position isn't moving at all, but when turning the camera only, there's a point where the nearby objects (the storage cage + translucent biohazard barrier) 'snap' to nearly fullbright, even though there doesn't appear to be any light source nearby, emissive or otherwise, that would cause it.
Moving the character a bit forward or backward causes that part of the scene to stay dark/shadowed, as looks to be intended, mimicking the other storage cage on the other side.

Some more strangeness with emissives: some have an obvious distance cutoff to their contribution to the scene, and that cutoff is really close (at the beginning of the video below, the speckled red dots all over the floor). However, unlike the screenspace-only emissive in the first video in this post, this one appears to be tracing successfully against the SDF, as the temporally unstable red splotches activate even when the emissive itself is offscreen.


Some other common bugs I noticed were textures not loading in properly, even on a 24GB 3090 Ti with 32GB of system RAM and a fast PCIe 4.0 NVMe SSD, with VRAM usage never going over 10GB. It reminded me of the issues Insomniac originally had with Spider-Man. Leaving the room and then coming back in, little chunks of the texture will unload and others will load, seemingly at random. I spent ten minutes or so spinning around and leaving and re-entering the room, and only once did it appear to load the entire set of textures on the whiteboard. In my video there's one chunk of it, at the very bottom of the whiteboard in the center, that never loads at all.


For all my complaining, I can see the potential in UE5; there are some points in the game, usually when there are zero emissives around, where it truly looks fantastic. Unfortunately, in this particular title, that seems to be less than 10% of it. The buggy emissives are everywhere.
 
After seeing Immortals of Aveum, Remnant 2, and Fort Solis, the premise of the thread really shines through. They're all mediocre-looking games with middling art design, yet they're all built around upscaling technology and demolish hardware. Until now, I was seeing upscaling tech as a big bonus. It seems it'll become mandatory moving forward, rendering native resolution obsolete.
Reconstruction was always going to become normalized. I've been saying this for years now. It's becoming downright wasteful to NOT use it.

And no, this doesn't support OP at all. It's wild after the past couple years of games releasing with issues, y'all are somehow suggesting this new batch of games proves that upscaling is just a crutch and is just covering for lazy/incompetent developers, as if games releasing with heavy demands/issues is somehow a completely new thing. It's such a lazy, short-sighted take.

As always, some devs will get more out of certain tech than others. Reconstruction raises the ceiling of what is possible with any given piece of hardware, and we will absolutely reap the rewards of this. Some devs not doing a great job is not proof that upscaling is bad or is only a 'crutch' or that it proves developers are all just lazy and incompetent now.
 
Reconstruction was always going to become normalized. I've been saying this for years now. It's becoming downright wasteful to NOT use it.

And no, this doesn't support OP at all. It's wild after the past couple years of games releasing with issues, y'all are somehow suggesting this new batch of games proves that upscaling is just a crutch and is just covering for lazy/incompetent developers, as if games releasing with heavy demands/issues is somehow a completely new thing. It's such a lazy, short-sighted take.

As always, some devs will get more out of certain tech than others. Reconstruction raises the ceiling of what is possible with any given piece of hardware, and we will absolutely reap the rewards of this. Some devs not doing a great job is not proof that upscaling is bad or is only a 'crutch' or that it proves developers are all just lazy and incompetent now.
Awesome strawman from you. OP has never said that all devs are lazy and incompetent. He said that it has become a crutch which it most definitely has given the latest examples. It doesn't mean that everyone has suddenly turned incompetent and that we will no longer see well-optimized games, it simply means that many developers will now use this as an easy way out instead of properly optimizing their games.

As for it being downright wasteful not to use it, no. It still isn't perfect and has issues that not everyone finds acceptable over native resolution. I'm not one of those people but native resolution should still be an option until upscaling surpasses it in every metric and effectively makes it obsolete.
 
Reconstruction was always going to become normalized. I've been saying this for years now. It's becoming downright wasteful to NOT use it.

And no, this doesn't support OP at all. It's wild after the past couple years of games releasing with issues, y'all are somehow suggesting this new batch of games proves that upscaling is just a crutch and is just covering for lazy/incompetent developers, as if games releasing with heavy demands/issues is somehow a completely new thing. It's such a lazy, short-sighted take.

As always, some devs will get more out of certain tech than others. Reconstruction raises the ceiling of what is possible with any given piece of hardware, and we will absolutely reap the rewards of this. Some devs not doing a great job is not proof that upscaling is bad or is only a 'crutch' or that it proves developers are all just lazy and incompetent now.

I agree, especially with how good advanced reconstruction techniques like DLSS are. It seems like an odd place to dig in your heels and insist on wanting ground truth, when basically every other thing done in the rendering pipeline is some sort of compromise for accuracy vs performance.

"Why use MSAA when we could just super sample the image and super sampling looks better?" Ain't nobody got the performance budget for that.
"Why use denoisers when raytracing when you could just shoot an order of magnitude two more rays and get closer to ground truth?" Ain't nobody got the performance budget for that.

I'd much rather the performance budget be spent on making an image temporally stable and lit and shadowed accurately, versus trying to brute force 4k or 8k worth of pixels to make it slightly sharper.

Take Fort Solis for example, I was running it at 1440p native, but I would have much rather run it at 1080p on my 1440p monitor even without any advanced upscaling techniques like DLSS or FSR2, if it meant they could throw that performance budget towards making all the emissives temporally stable - traced against a higher resolution SDF, higher sample count, more budget spent on denoising, etc.
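
Some rough numbers behind the "nobody has the budget" point and the 1440p-to-1080p trade (back-of-the-envelope only, the costs are assumed):

```python
# 4x supersampling at 4K shades four samples per pixel:
ssaa_samples = 3840 * 2160 * 4
print(f"4K with 4x SSAA: {ssaa_samples / 1e6:.1f}M shaded samples per frame")

# Monte Carlo noise falls off with the square root of the ray count, so cutting
# noise by 10x via brute force needs ~100x the rays.
noise_reduction = 10
print(f"{noise_reduction}x less noise ~= {noise_reduction ** 2}x the rays")

# Dropping from 1440p native to 1080p frees roughly this share of the
# resolution-dependent work, which could go to denoising/sample counts instead:
freed = 1 - (1920 * 1080) / (2560 * 1440)
print(f"1440p -> 1080p frees ~{freed:.0%} of the per-pixel budget")
```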
 
Reconstruction was always going to become normalized. I've been saying this for years now. It's becoming downright wasteful to NOT use it.

And no, this doesn't support OP at all. It's wild after the past couple years of games releasing with issues, y'all are somehow suggesting this new batch of games proves that upscaling is just a crutch and is just covering for lazy/incompetent developers, as if games releasing with heavy demands/issues is somehow a completely new thing. It's such a lazy, short-sighted take.

As always, some devs will get more out of certain tech than others. Reconstruction raises the ceiling of what is possible with any given piece of hardware, and we will absolutely reap the rewards of this. Some devs not doing a great job is not proof that upscaling is bad or is only a 'crutch' or that it proves developers are all just lazy and incompetent now.
Congratulations on refuting an argument that neither I nor anyone else made... You built a beautiful strawman and took it down admirably.
 
I agree, especially with how good advanced reconstruction techniques like DLSS are. It seems like an odd place to dig in your heels and insist on wanting ground truth, when basically every other thing done in the rendering pipeline is some sort of compromise for accuracy vs performance.
DLSS being good is highly subjective. It introduces artifacts, ghosting, etc., and reduces image clarity (artificially increasing the sharpness does not mean better). In some games it works well; in others, it works poorly. You may like it, others may not.
"Why use MSAA when we could just super sample the image and super sampling looks better?" Ain't nobody got the performance budget for that.
"Why use denoisers when raytracing when you could just shoot an order of magnitude two more rays and get closer to ground truth?" Ain't nobody got the performance budget for that.
An argument no one ever made.
I'd much rather the performance budget be spent on making an image temporally stable and lit and shadowed accurately, versus trying to brute force 4k or 8k worth of pixels to make it slightly sharper.

Take Fort Solis for example, I was running it at 1440p native, but I would have much rather run it at 1080p on my 1440p monitor even without any advanced upscaling techniques like DLSS or FSR2, if it meant they could throw that performance budget towards making all the emissives temporally stable - traced against a higher resolution SDF, higher sample count, more budget spent on denoising, etc.
And that's the key in bold. What you prefer doesn't necessarily equate to what others prefer. Your preferences are not the beginning and ending of the discussion at hand. Some of us have different preferences.
 
It still isn't perfect and has issues that not everyone finds acceptable over native resolution. I'm not one of those people but native resolution should still be an option until upscaling surpasses it in every metric and effectively makes it obsolete.
Stop saying "native" resolution as if it means anything. As I've explained, it has been a myth for a while and building arguments based on the choice of which terms were upsampled in some previous games (without even calling out half res particles, half res AO, quarter res diffuse GI, and a zillion other things that have been common forever) feels arbitrary. We can certainly talk about blur, ghosting, denoising, frame rates and so on which are all inter-related, but picking "4k" as an arbitrary metric just doesn't make sense anymore. Is 4k with 1 shadow sample per pixel better than 2k with 8 shadow samples per pixel? If you let games settle with a static camera you're effectively getting far higher sampling rates than you ever used to with "4k", but of course fast motion can cause ghosting, smearing and other issues. These are all complicated tradeoffs and reducing them to some basic idea about "4k native performance" is just misunderstanding both current and past rendering pipelines.

I don't think any game has shipped yet without the option to disable upscaling, although I imagine it will be coming eventually (just like past games often didn't support full res particles, GI, pixel-matched shadow map sizes, etc.). It's just very expensive because the per-pixel work has gone up a lot with dynamic lighting, GI and detailed geometry. Sure for some people looking at a high resolution texture on a flat polygon is basically all they need to think something is "detailed" (as some of the examples in this very thread demonstrate), but I think for the majority the move towards better pixels with more accurate and more dynamic lighting and scenes is a good thing.

Ultimately the complaint here has nothing to do with upscaling... it's a mix of conservative bias and legitimately diminishing returns in the eyes of the average gamer. Dynamic GI doesn't cost a little more than baked lighting, it costs several orders of magnitude more. PT Cyberpunk looks a little better than regular RT Cyberpunk for a huge additional cost. Certainly lots of games have and will continue to look great with baked lighting, but there are many game designs that are simply not compatible with that, and even in cases where baked lighting probably would have worked for the shipped product, the dev resources required to do it would have compromised the content in other ways.

If you're one of the folks who is happy with the look of older games with baked lighting and high resolution textures, that's perfectly fine. But I'd wager that many of us also want games that are bigger, more detailed and more dynamic. There is space for all of these to exist, just like there are still good 2D and retro styled games.
 
However, unlike the screenspace-only emissive in the first video in this post, this one appears to be tracing successfully against the SDF, as the temporally unstable red splotches activate even when the emissive itself is offscreen.
That's the thing - analytic lights get proper injection and tracing in the Lumen data structures. Emissive objects only get considered in the final gather. Thus they are very "cheap" (if not "free"), but if they don't get hit by various traces, or are too small to be represented in the various data structures, they simply won't show up at all. This is precisely why they shouldn't really be used as primary illumination, and why small ones produce excessive noise or are even lost entirely when offscreen (and thus can't be seen by screen traces).
 
Stop saying "native" resolution as if it means anything. As I've explained, it has been a myth for a while and building arguments based on the choice of which terms were upsampled in some previous games (without even calling out their feelings on half res particles, half res AO, quarter res diffuse GI, and a zillion other things that have been common forever) feels arbitrary. We can certainly talk about blur, ghosting, denoising, frame rates and so on which are all inter-related, but picking "4k" as an arbitrary metric just doesn't make sense anymore. Is 4k with 1 shadow sample per pixel better than 2k with 8 shadow samples per pixel? If you let games settle with a static camera you're effectively getting far higher sampling rates than you ever used to with "4k", but of course fast motion can cause ghosting, smearing and other issues. These are all complicated tradeoffs and reducing them to some basic idea about "4k native performance" is just misunderstanding both current and past rendering pipelines.

I don't think any game has shipped yet without the option to disable upscaling, although I imagine it will be coming eventually (just like past games often didn't support full res particles, GI, pixel-matched shadow map sizes, etc.). It's just very expensive because the per-pixel work has gone up a lot with dynamic lighting, GI and detailed geometry. Sure for some people looking at a high resolution texture on a flat polygon is basically all they need to think something is "detailed" (as some of the examples in this very thread demonstrate), but I think for the majority the move towards better pixels with more accurate and more dynamic lighting and scenes is a good thing.

Ultimately the complaint here has nothing to do with upscaling... it's a mix of conservative bias and legitimately diminishing returns in the eyes of the average gamer. Dynamic GI doesn't cost a little more than baked lighting, it costs several orders of magnitude more. PT Cyberpunk looks a little better than regular RT Cyberpunk for a huge additional cost. Certainly lots of games have and will continue to look great with baked lighting, but there are many game designs that are simply not compatible with that, and even in cases where baked lighting probably would have worked for the shipped product, the dev resources required to do it would have compromised the content in other ways.

If you're one of the folks who is happy with the look of older games with baked lighting and high resolution textures, that's perfectly fine. But I'd wager that many of us also want games that are bigger, more detailed and more dynamic. There is space for all of these to exist, just like there are still good 2D and retro styled games.
For one, I never said 4K. I don't even game at 4K. I specifically mentioned native resolution in reference to the developers of Remnant II, who said they developed the game with upscaling in mind. The issue this introduces is that GPUs traditionally aimed at 1080p end up with a base resolution of 720p with DLSS Quality, which exacerbates the problems of technologies such as DLSS and FSR to a point that may be downright undesirable for users. As you mentioned, issues such as ghosting and smearing can become awful.

The premise of the thread is fairly straightforward: by their looks alone, those games should run better, and upscaling should further increase performance at the expense of the usual issues; it shouldn't be mandatory just to get acceptable frame rates, whether at 4K or all the way down to 1080p on a 4090 (although I'd say needing it at 4K is perfectly fine, since 4K is just too demanding). No one complained about PT Cyberpunk, or hell, even the original Cyberpunk, needing upscaling to run at playable frame rates, because we can clearly see the payoffs and understand the reasons behind them. When I look at the middling graphics of Immortals of Aveum, I'm simply puzzled as to why it is so demanding and practically needs upscaling to run decently. That's without mentioning Remnant II, which apparently saw a patch increase its performance by up to 50%; if accurate, that further hints at the rushed nature and lack of optimization surrounding the game.

To top it all off, the settings of Immortals of Aveum scale poorly and even dropping them to the lowest doesn't claw back that much performance. You wouldn't see any of us complaining if those games looked amazing.

Overall, I still prefer native resolution (I mainly game at 3440x1440) over using DLSS, but I almost always choose DLSS because, to me, the benefits are 100% worth it. That isn't the case for everyone, though, and until upscaling techniques beat native resolution in every conceivable way (or most ways, really), I still think it is paramount to design games around native resolution and view upscaling as a bonus.

I'm no game developer so your vision is likely very different from mine but as a layman who only uses UE5 as a hobby, these are my thoughts on the matter.


 
Stop saying "native" resolution as if it means anything. As I've explained, it has been a myth for a while and building arguments based on the choice of which terms were upsampled in some previous games (without even calling out half res particles, half res AO, quarter res diffuse GI, and a zillion other things that have been common forever) feels arbitrary.
This was a conversation here some years back when pixel counting was in vogue. 'Native res', in the end, only referred to the opaque geometry.
Ultimately the complaint here has nothing to do with upscaling... it's a mix of conservative bias and legitimately diminishing returns in the eyes of the average gamer.
I don't know that I fully get BitByte's argument, but I think it basically comes down to: "if upscaling didn't exist, this game would be so bad it wouldn't be released. Upscaling has dug the developers out of a hole, and does so for a lot of devs who should otherwise crash and burn, which is what it would take to bring game QA back up to yesteryear standards."

The problem is the economics of game development. The solution at present isn't to fix those economics but to render at lower and lower resolutions and just upscale to something stomachable.
 