Upscaling Technology Has Become A Crutch

Now that DX11 is basically EOL, I think upscaling is being used to eat up the DX12 performance tax. Most devs don't have the time and/or the know-how to write high-performance DX12 code.

Remnant 2 is odd because it hugely benefits from scaling. Just going from native 4K to 4K/DLSS Quality boosts performance by like 60%, which I don't think I've ever seen before. I generally see it in the range of 20-30%.
That's about the norm for games that don't have uncommon bottlenecks.
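
For anyone who wants the back-of-envelope math behind those percentages, here's a quick sketch. The per-axis scale factors are the commonly cited DLSS preset defaults; the actual internal resolution can vary per game, so treat this as illustrative rather than Remnant 2 data:

```cpp
// Rough internal-resolution math for the usual DLSS presets at a 4K output.
// Scale factors are the commonly cited per-axis defaults (assumption: the
// game uses them unmodified).
#include <cmath>
#include <cstdio>

int main() {
    const int outW = 3840, outH = 2160;
    struct Preset { const char* name; double scale; };
    const Preset presets[] = {
        {"Native",      1.0},
        {"Quality",     2.0 / 3.0},
        {"Balanced",    0.58},
        {"Performance", 0.5},
    };
    const double nativePixels = double(outW) * outH;
    for (const Preset& p : presets) {
        const int w = int(std::lround(outW * p.scale));
        const int h = int(std::lround(outH * p.scale));
        std::printf("%-12s %4dx%-4d  %5.1f%% of native pixels\n",
                    p.name, w, h, 100.0 * (double(w) * h) / nativePixels);
    }
    // Quality mode shades roughly 44% of the pixels. A ~60% fps uplift from
    // that means the frame is almost entirely resolution-bound; the more
    // typical 20-30% gain reflects fixed costs (CPU, geometry, post) that
    // don't shrink with render resolution.
    return 0;
}
```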
 
See, the problem is it doesn't matter if it's baseline UE5 performance. They chose to use Nanite and virtual shadow maps. Rewinding further, they chose to use UE5 in the first place. That's a choice they were free to make, but if I look at the game's graphics and compare them to its performance, I consider it very poor. To me, that is unacceptable, and to top it off, I'm asked to use reconstruction techniques to play an average-looking game at 1440p on a $2000+ GPU. Switching gears to console for a moment, imagine if reconstruction weren't available on consoles. What ungodly resolutions would console players have to endure?

At the end of the day, it’s a game and you have to make smart trade offs. If you’re going to deliver this level of performance on a 4090, the graphics must be revolutionary and Remnant 2 is not that….

It's kind of interesting the difference in perspective. I wouldn't really call this an optimization problem. I don't know if I could say Remnant 2 is poorly optimized, at least not yet. Maybe an investigation or some profiling could convince me. I see it as more of a trade-off problem, and a lot of people don't like the trade-offs, which might be a little damning of UE5 in general. Nanite and virtual shadow maps solve real problems, they're just fairly expensive. With nanite you can have complex geometry with very little LOD pop-in, and then virtual shadow maps give you a unified shadowing system for dynamically lit environments that is compatible with nanite. So what we end up with is a case of people looking at the quality of the models in the environment and making a judgement that it's not worth it, even though it does solve a lot of image stability and image quality issues. If you pay attention to the videos, you won't see the typical problems you get with pop-in of LODs or weird shadow-distance pop-in etc. At least it's not easy to spot if it's there. Basically it seems like most people would trade-off image quality problems like pop-in and LOD changes for more resolution.
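
To make the pop-in point concrete, here's roughly what the traditional discrete-LOD path that Nanite replaces looks like - purely an illustrative sketch with hypothetical structs and thresholds, not actual UE5 code:

```cpp
// Sketch of the classic discrete-LOD pick that causes the pop-in being
// traded away here. Everything below is hypothetical, for illustration.
#include <cstddef>

struct Mesh { /* vertex/index buffers for one detail level ... */ };

struct LodSet {
    static constexpr std::size_t kLevels = 4;
    Mesh  levels[kLevels];          // LOD0 = full detail ... LOD3 = coarsest
    float switchDistance[kLevels];  // e.g. {15.f, 40.f, 90.f, 200.f} metres
};

// Pick one discrete mesh per frame based on camera distance. When the camera
// crosses a threshold, the whole mesh swaps at once -> a visible "pop".
// Nanite instead streams per-cluster detail continuously, so there's no
// single switch point to notice, at the cost of a heavier base pipeline.
const Mesh& SelectLod(const LodSet& set, float distanceToCamera) {
    for (std::size_t i = 0; i < LodSet::kLevels; ++i) {
        if (distanceToCamera < set.switchDistance[i])
            return set.levels[i];
    }
    return set.levels[LodSet::kLevels - 1];
}
```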
 
I’m curious, what would you classify as revolutionary graphics? Anyone can chime in too
OG Crysis levels of wow.
Aren't the issues specific to Remnant 2 more the choice of Nanite and virtual shadow maps? If they hadn't picked those, they could run a lot faster. What has upscaling got to do with that choice? I guess you're saying the existence of upscaling enabled them to be able to make that choice because without, framerate would be too low. Well, if they then gave up on nanite and used different tech and got a better framerate, they could then add in reconstruction and make the results even better!

Upscaling isn't the deciding factor here. Probably it was a desire to use the latest tech for promotional purposes - the first non-Epic Nanite game, right? It's getting lots of coverage as a result.

It basically comes down to good games and not so good games. Other devs putting in more effort will out-compete on visuals and get the sales. That's how this business has always operated. If consumers choose to buy lower-effort creations, that's on them. If I were making a AAA game, I'd be intending to use upscaling. I'd crush the GPU, get every ounce of performance from it, and create the best-looking game ever. I wouldn't waste performance getting an extra 10% pixel fidelity from rendering native.

Do you think Marbles RTX would look and run better without upscaling?
The bolded is exactly what I'm saying. It's led to a huge amount of poor tradeoffs, which in turn has led to a bunch of games releasing with suspect performance.
 
OG Crysis levels of wow.

The bolded is exactly what I'm saying. It's led to a huge amount of poor tradeoffs, which in turn has led to a bunch of games releasing with suspect performance.

But are other games releasing with suspect performance because of bad tradeoffs, or because they're actually doing things in a bad way? Wasn't Jedi: Fallen Order allocating memory in the middle of a frame, or something like that? That's not really a trade-off. It's just an architecture error.
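
I don't know what Fallen Order was actually doing internally, but the general shape of that kind of mistake looks something like this - a heap allocation in the middle of the frame versus a preallocated frame arena that costs the same every frame. All names here are made up for illustration:

```cpp
// Illustration only: why mid-frame allocation is an architecture problem
// rather than a tuning knob. All names are hypothetical.
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Naive version: a general-purpose heap allocation every frame. malloc/new
// can take locks, touch new pages, and fragment the heap -> unpredictable
// frame-time spikes on busy frames.
void BuildDrawListNaive(std::size_t drawCount) {
    std::vector<std::uint32_t> visibleDraws;  // heap allocation mid-frame
    visibleDraws.reserve(drawCount);          // may reallocate as counts grow
    // ... fill and submit ...
}

// Arena version: grab one big block at startup, hand out bump-pointer
// allocations during the frame, reset the whole thing once per frame.
class FrameArena {
public:
    explicit FrameArena(std::size_t bytes) : buffer_(bytes), offset_(0) {}

    // align must be a power of two.
    void* Alloc(std::size_t bytes,
                std::size_t align = alignof(std::max_align_t)) {
        std::size_t p = (offset_ + align - 1) & ~(align - 1);
        assert(p + bytes <= buffer_.size() && "frame arena exhausted");
        offset_ = p + bytes;
        return buffer_.data() + p;
    }
    void Reset() { offset_ = 0; }             // called once at end of frame

private:
    std::vector<std::byte> buffer_;           // allocated once, up front
    std::size_t offset_;
};

void BuildDrawListArena(FrameArena& arena, std::size_t drawCount) {
    auto* visibleDraws = static_cast<std::uint32_t*>(
        arena.Alloc(drawCount * sizeof(std::uint32_t)));
    (void)visibleDraws;
    // ... fill and submit; no heap traffic, cost is constant per frame ...
}
```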
 
It's kind of interesting the difference in perspective. I wouldn't really call this an optimization problem. I don't know if I could say Remnant 2 is poorly optimized, at least not yet. Maybe an investigation or some profiling could convince me. I see it as more of a trade-off problem, and a lot of people don't like the trade-offs, which might be a little damning of UE5 in general. Nanite and virtual shadow maps solve real problems, they're just fairly expensive. With nanite you can have complex geometry with very little LOD pop-in, and then virtual shadow maps give you a unified shadowing system for dynamically lit environments that is compatible with nanite. So what we end up with is a case of people looking at the quality of the models in the environment and making a judgement that it's not worth it, even though it does solve a lot of image stability and image quality issues. If you pay attention to the videos, you won't see the typical problems you get with pop-in of LODs or weird shadow-distance pop-in etc. At least it's not easy to spot if it's there. Basically it seems like most people would trade-off image quality problems like pop-in and LOD changes for more resolution.

It’s somewhat similar to the debates on RT except with RT you can usually turn it off. The lukewarm reaction to Remnant 2 seems to have as much to do with art style as it has to do with the tech.
 
It’s somewhat similar to the debates on RT except with RT you can usually turn it off. The lukewarm reaction to Remnant 2 seems to have as much to do with art style as it has to do with the tech.

I'm going to pick it up in a week or so when a friend of mine has time to start playing it, so I'll be able to play around with it. The first game was A+ for gameplay, and I've only heard good things about this one. I'm very much a 120 fps or bust player, so it'll be interesting to see if I can find a combination of settings that works for me. I have a feeling I'll be stuck at something like 1440p DLSS Balanced, shadows low, everything else high, and even then I'm not sure I'll hit my target.

One thing that's kind of ironic is you have a technology like nanite that can display very complex geometry more efficiently than a traditional raster pipeline, but then everyone has to turn the resolution down for performance ... which kind of defeats some of the point. I'm genuinely curious to see what I think of it in person.
 
OG Crysis levels of wow.

The bolded is exactly what I'm saying. It's led to a huge amount of poor tradeoffs, which in turn has led to a bunch of games releasing with suspect performance.
That's never going to happen again. Certainly not without some huge leap in technology to allow for bigger jumps between technological generations. Games have hit a plateau of human labor.
 
That's never going to happen again. Certainly not without some huge leap in technology to allow for bigger jumps between technological generations. Games have hit a plateau of human labor.

Not to mention Crysis ran horribly when it came out. That kind of leap now would absolutely bomb with the price of GPUs.
 
That's never going to happen again. Certainly not without some huge leap in technology to allow for bigger jumps between technological generations. Games have hit a plateau of human labor.
Maybe, and that's fine I guess.
Not to mention Crysis ran horribly when it came out. That kind of leap now would absolutely bomb with the price of GPUs.
Crysis didn't run well when it came out, but you could clearly see why. Remnant 2, on the other hand, doesn't run well on the most powerful GPU out right now, and it's not really justifiable.
 
I'm super curious about impressions of Remnant 2, so I've been watching some vids. This one is great entertainment.


This guy says he doesn't want better graphics, he just wants performance, and then gets mad that he has to set the graphics to low to increase performance and leaves it on high anyway. Lol.
Because the cutbacks to low are way too far. Imagine having to use low on a 3080 with DLSS Performance to get an unstable 60fps in a game that looks like Remnant 2. There's no point, so it's not surprising he just went back up to high. Btw, he's 100% correct. On console, the game runs at 792p with reconstruction, and even with that, they can't lock to 60fps. These devs are trying to take us back to the PS360 era with these resolutions. It's honestly a giant joke.
 
I'm trying not to spoiler the game for myself, but scanning some of the environment videos, they can look pretty nice. Some of the areas, especially that tutorial area, don't look as good as others.
 
Because the cutbacks to low are way too far. Imagine having to use low on a 3080 with DLSS Performance to get an unstable 60fps in a game that looks like Remnant 2. There's no point, so it's not surprising he just went back up to high. Btw, he's 100% correct. On console, the game runs at 792p with reconstruction, and even with that, they can't lock to 60fps. These devs are trying to take us back to the PS360 era with these resolutions. It's honestly a giant joke.

His logic is a bit weird. If all he cares about is performance, so much that he says it should even have a "cartoony" art style if that's what it takes, then I don't understand how setting it to low is "too far." It's hard to find good comparison pics, but the low setting looks like it turns off virtual shadow maps, which gives a pretty large performance increase. It also seems to disable screen-space reflections. The game doesn't have global illumination. The performance does seem a bit low to me, but I'm also not great at judging 4K performance. 4K DLSS Performance is normally quite a bit slower than native 1080p, but I'm not sure how much that normally scales, or whether adjusting the post-processing settings takes care of it. I guess I'm not really sure if it's just post processing that's done after the upscale, or if there are other things depending on the game. A 4K screen with a 3080 is a pretty horrible combo if all you care about is performance. 1440p is definitely a sweeter spot, but the majority of performance-focused PC players use native 1080p for a reason.
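
On the 4K-DLSS-Performance-vs-native-1080p point, here's a quick sketch of the pixel counts. Which passes actually run after the upscale varies per game, so this is just the ratio, not a measurement:

```cpp
// Why 4K + DLSS Performance is usually slower than native 1080p even though
// both shade ~2.07M pixels: some work scales with the *output* resolution
// (the upscale pass itself, anything post-upscale, UI compositing).
#include <cstdio>

int main() {
    const double internal4kPerf = 1920.0 * 1080;  // DLSS Performance at 4K
    const double native1080p    = 1920.0 * 1080;  // same shading load
    const double output4k       = 3840.0 * 2160;  // output-resolution work
    std::printf("shading pixels:           %.0f vs %.0f (identical)\n",
                internal4kPerf, native1080p);
    std::printf("output-resolution pixels: %.1fx more than a 1080p screen\n",
                output4k / native1080p);          // 4.0x
    return 0;
}
```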

I'm actually really tempted to pick up the game a bit earlier than planned just to check the performance in the tutorial area.
 
I can't really comment on the software side. However, it is certainly a crutch over on the hardware side. I feel that Nvidia has based the entirety of the 40x0 series' sales on the back of DLSS 3.0.

I have a 3080 and have used DLSS, and I always choose native over DLSS unless I have no choice but to use it. I haven't found an instance where DLSS looks better than a higher native resolution.
 