Upscaling Technology Has Become A Crutch

That's technically true, but in the context of the DLSS discussion it's a false equivalency. Designing a game specifically for DLSS is not equivalent to designing a game for DX12. The consequences of doing so are vastly different, and I'm frankly not sure why this was even brought up. Look at the post I quoted.

No, but designing a game with AI scaling and frame generation in mind is not. There are already other AI scaling techniques out there, albeit maybe not as good as DLSS for now, and there will be other AI frame generation techniques too.
Furthermore, AI scaling is not "mandatory." If you don't like it, you can always disable it. Of course, you may have to reduce other settings (e.g. texture size, filtering quality, etc.) to maintain frame rate, but since AI scaling is basically compensating for a lack of fill rate, you can view it as a matter of choice. At worst, you can play at a reduced resolution.
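
To put the fill-rate point in rough numbers: rendering internally at a lower resolution and upscaling means only a fraction of a native 4K frame's pixels get shaded each frame. Here is a back-of-the-envelope sketch; the per-axis scale factors are the commonly cited ones for the usual quality modes and should be read as approximations, not vendor-official figures.

```python
# Rough illustration of the fill-rate trade-off behind AI upscaling.
# The per-axis scale factors are approximate, commonly cited values for
# the usual quality modes, not vendor-official specifications.

TARGET = (3840, 2160)  # 4K output resolution

MODES = {
    "Native":            1.0,
    "Quality":           2 / 3,   # roughly 1440p internal at a 4K target
    "Balanced":          0.58,
    "Performance":       0.5,     # roughly 1080p internal at a 4K target
    "Ultra Performance": 1 / 3,
}

native_pixels = TARGET[0] * TARGET[1]

for mode, scale in MODES.items():
    w, h = int(TARGET[0] * scale), int(TARGET[1] * scale)
    shaded = w * h
    print(f"{mode:<18} {w:>4}x{h:<4}  "
          f"{shaded / 1e6:5.2f} MPix shaded  "
          f"({shaded / native_pixels:6.1%} of native)")
```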
 
Once again with the false equivalency. The situation with Voodoo accelerators is drastically different from today's. Firstly, there's no reason for DLSS to be proprietary, as XeSS has shown us. DLSS could run on other GPUs; all that's needed is for hardware vendors to implement their own hardware accelerators. I will say it again: targeting vendor-specific technology is wrong, and consumers and devs will pay the price. As it stands, the proprietary nature of DLSS has essentially created a monopoly in the GPU space. We're paying through the nose for GPUs, and it's why you see enthusiasts cheering on Intel. It's wrong for developers to do so, and they'll suffer the consequences of it.

You are basically contradicting yourself here. On one hand, you are saying that because DLSS is so good and people are targeting it, NVIDIA GPUs become mandatory. On the other hand, you are saying XeSS proved that it's possible to make something of similar quality to DLSS. If so, what's the problem? It's not as if NVIDIA is preventing people from using XeSS.
 
Once again with the false equivalency. The situation with Voodoo accelerators is drastically different from today's. Firstly, there's no reason for DLSS to be proprietary, as XeSS has shown us. DLSS could run on other GPUs; all that's needed is for hardware vendors to implement their own hardware accelerators. I will say it again: targeting vendor-specific technology is wrong, and consumers and devs will pay the price. As it stands, the proprietary nature of DLSS has essentially created a monopoly in the GPU space. We're paying through the nose for GPUs, and it's why you see enthusiasts cheering on Intel. It's wrong for developers to do so, and they'll suffer the consequences of it.
No it's not, because your commentary is around upscaling as a crutch. Was leveraging 3dfx Glide a crutch during a time when no CPU was capable of running Quake well?

DX12 supports any form of AI upscaling via DirectML; it's just that no one wants to build a model that does this. Every vendor is currently making their own black box; otherwise, DirectML would be fully able to fulfill this requirement.

Epic can support all of these, and if you don't have them, you run the game at lower performance. It's certainly still a lot better than what we were getting in the CPU rendering days. At least you're getting 30fps at high resolution. If you want 100fps at 4K, then you need frame generation. You need technologies that can get you there.

If I am reading the room right here, a group of people are upset that their super expensive video cards are not getting them 100fps at 4K without it, and you're blaming developers for not meeting that requirement.

How is this different from being upset back in the day that the absolute top-of-the-line CPU could not compete against GLQuake on a Voodoo? It must be the developers' fault it's not running well on my CPU.

To me these are very parallel. The future is headed towards AI. If you don't have it, you don't get the performance levels that are possible with it on.
 
DX12 supports any form of AI upscaling via DirectML; it's just that no one wants to build a model that does this. Every vendor is currently making their own black box; otherwise, DirectML would be fully able to fulfill this requirement.
Minor correction on this point - I don't believe DirectML is sufficient to efficiently implement something like DLSS/XeSS currently. Perhaps in the future, but today the drivers do still have special access that is required and cannot be replicated by non-IHVs.
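
For anyone curious what the "build your own model and run it through DirectML" route could even look like, here is a minimal sketch using ONNX Runtime's DirectML execution provider. The model file, its 2x scale factor, and the NCHW input layout are all hypothetical, and per the correction above this path would not get the driver-level access that DLSS/XeSS rely on, so treat it as illustrative only.

```python
# Minimal sketch: running a hypothetical single-frame upscaling network
# through DirectML via ONNX Runtime's DML execution provider.
# Requires the onnxruntime-directml package on Windows.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "upscaler.onnx",                     # hypothetical 2x upscaling model
    providers=["DmlExecutionProvider"],  # route inference through DirectML
)

inp = session.get_inputs()[0]
# Assume an NCHW float32 input, e.g. a 1080p frame to be upscaled to 4K.
frame = np.random.rand(1, 3, 1080, 1920).astype(np.float32)

outputs = session.run(None, {inp.name: frame})
print(outputs[0].shape)  # expected (1, 3, 2160, 3840) for a 2x model
```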
 
TAA was invented because MSAA was incompatible with post-processing shaders. Unless you're going to supersample everything, which is a total waste of GPU, you're stuck with TAA.
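
As an aside for anyone unfamiliar with what TAA is doing under the hood, here is a deliberately stripped-down sketch of the core idea: amortize supersampling over time by blending one jittered sample per frame into a history buffer. The toy "renderer" below is hypothetical, and motion vectors, reprojection, and history clamping (the hard parts of a real TAA) are omitted; assume a static camera.

```python
# Toy temporal accumulation: blend one new jittered sample per frame into
# a running history instead of taking many samples per pixel per frame.
import numpy as np

def taa_accumulate(history: np.ndarray, current: np.ndarray,
                   alpha: float = 0.1) -> np.ndarray:
    """Exponential blend of the new frame into the accumulated history."""
    return (1.0 - alpha) * history + alpha * current

def render(jitter: float, width: int = 16) -> np.ndarray:
    """Fake 1D 'renderer': a hard edge sampled at a jittered sub-pixel offset."""
    x = np.arange(width) + jitter
    return (x < width / 2).astype(np.float32)

rng = np.random.default_rng(0)
history = render(0.0)
for _ in range(64):
    history = taa_accumulate(history, render(rng.uniform(-0.5, 0.5)))

# The pixel straddling the edge converges toward its partial coverage,
# i.e. an anti-aliased value, without ever supersampling a single frame.
print(np.round(history, 2))
```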
 
Minor correction on this point - I don't believe DirectML is sufficient to efficiently implement something like DLSS/XeSS currently. Perhaps in the future, but today the drivers do still have special access that is required and cannot be replicated by non-IHVs.
Makes sense! It also provides some insight as to why MS is likely not incentivized to create their own model and offer it with DirectX: it just wouldn't be able to compete. At least not yet.

OT: not sure if you can answer, but did Epic ever consider trying to make their own upscaling or AA model to use via DML, and come to the conclusion that it wouldn't work?
 
TAA was invented because MSAA was incompatible with post-processing shaders. Unless you're going to supersample everything, which is a total waste of GPU, you're stuck with TAA.
... and MSAA doesn't work efficiently with deferred shading. And shader/BRDF aliasing (specular shimmer, etc.) isn't addressed at all by MSAA. MSAA only targeted one source of aliasing, and while it did so efficiently for the time, it makes sense to unify the other sources under broader reconstruction/filtering schemes.

OT: not sure if you can answer, but did Epic ever consider trying to make their own upscaling or AA model to use via DML?
I don't know, and probably couldn't comment if I did, sorry.
 
It's not just you. For me, as it stands, UE5's performance cost compared to the visuals is one of the most unjustifiable engine trade-offs I've seen in a long time.
UE5 is not gonna be some unique engine here. 'Next gen' always requires a VERY big uplift in general base requirements to make a significant leap in visuals and scope possible.

That said, Immortals of Aveum is clearly not a prime UE5 showcase title, though it still carries the engine's performance burdens by using some of its heaviest tech options. There will undoubtedly be developers who make better use of it while having somewhat similar processor demands.

We should be clear here that UE5 is gonna be used by many developers, just like UE3 and UE4 were, if not more so. And those engines were all capable of amazing things, no matter how many lesser/newer developers were putting out much less impressive games on them. It's gonna be the same situation here. Things are still very much just getting started.
 
UE5 is not gonna be some unique engine here. 'Next gen' always requires a VERY big uplift in general base requirements to make a significant leap in visuals and scope possible.

That said, Immortals of Aveum is clearly not a prime UE5 showcase title, though it still carries the engine's performance burdens by using some of its heaviest tech options. There will undoubtedly be developers who make better use of it while having somewhat similar processor demands.

We should be clear here that UE5 is gonna be used by many developers, just like UE3 and UE4 were, if not more so. And those engines were all capable of amazing things, no matter how many lesser/newer developers were putting out much less impressive games on them. It's gonna be the same situation here. Things are still very much just getting started.

I think it's the hardware more so than UE5, although the engine is not without its own faults. People have been complaining about the jumps between the 20, 30, and 40 series cards, and we know of AMD's issues with performance, especially going from RDNA 2 to RDNA 3. I think the extreme price changes between generations of cards are making people hold on to their hardware longer. In the past, when we got a new console generation, we had large leaps in graphics cards with small, if any, price jumps between generations.

Who knows, maybe the 50 series and whatever is next from AMD will be a sizable jump without a massive price increase, everyone will move to the new cards, and we will all forget about the performance of UE5.
 
You are basically contradicting yourself here. On one hand, you are saying that because DLSS is so good and people are targeting it, NVIDIA GPUs become mandatory. On the other hand, you are saying XeSS proved that it's possible to make something of similar quality to DLSS. If so, what's the problem? It's not as if NVIDIA is preventing people from using XeSS.
I'm not contradicting myself at all. DLSS was the first to kick it off and got the first-mover advantage. It all but consolidated Nvidia's current position. AMD had nothing and still has nothing. Intel showed up 4-5 years later, so obviously they're struggling to get traction. The key is that XeSS is just as promising as DLSS and works on all GPUs. The only reason Nvidia GPUs appear mandatory is simply that DLSS doesn't run on other GPUs. There's no technical reason for it; it's just about money.
No it's not, because your commentary is around upscaling as a crutch. Was leveraging 3dfx Glide a crutch during a time when no CPU was capable of running Quake well?

DX12 supports any form of AI upscaling via DirectML; it's just that no one wants to build a model that does this. Every vendor is currently making their own black box; otherwise, DirectML would be fully able to fulfill this requirement.

Epic can support all of these, and if you don't have them, you run the game at lower performance. It's certainly still a lot better than what we were getting in the CPU rendering days. At least you're getting 30fps at high resolution. If you want 100fps at 4K, then you need frame generation. You need technologies that can get you there.

If I am reading the room right here, a group of people are upset that their super expensive video cards are not getting them 100fps at 4K without it, and you're blaming developers for not meeting that requirement.

How is this different from being upset back in the day that the absolute top-of-the-line CPU could not compete against GLQuake on a Voodoo? It must be the developers' fault it's not running well on my CPU.

To me these are very parallel. The future is headed towards AI. If you don't have it, you don't get the performance levels that are possible with it on.
Huh, I don't even know what you're arguing anymore. You said, and I quote:
Saying you want more from DLSS is fine. Saying developers suck for targeting a game specifically for DLSS is wrong imo.
My whole discussion is centered around this comment. This tangential discussion is not about upscaling in general; it's about that specific point. I mean, I thought I made it very obvious by only quoting that particular line. Everything you've raised so far really has nothing to do with the line I quoted. 3dfx has nothing to do with it, and neither does DX12. A developer does indeed suck if they make a game specifically for use with DLSS and DLSS exclusively. At least, that's what I'm currently arguing against.

Finally, I think you're definitely reading the room wrong. People aren't upset that their GPUs are delivering poor performance. They're upset that visually unimpressive games have an unjustifiable performance cost. If people felt the visuals justified the cost, there would be no complaints. Not too long ago, people were crying about TLOU on PC, saying its performance cost was unjustified. It turned out they were partially right, and it was partially fixed.
 
UE5 is not gonna be some unique engine here. 'Next gen' always requires a VERY big uplift in general base requirements to make a significant leap in visuals and scope possible.

That said, Immortals of Aveum is clearly not a prime UE5 showcase title, though it still carries the engine's performance burdens by using some of its heaviest tech options. There will undoubtedly be developers who make better use of it while having somewhat similar processor demands.

We should be clear here that UE5 is gonna be used by many developers, just like UE3 and UE4 were, if not more so. And those engines were all capable of amazing things, no matter how many lesser/newer developers were putting out much less impressive games on them. It's gonna be the same situation here. Things are still very much just getting started.

Diminishing returns are real, and I think something like UE5 has technologies to produce results that would not be possible in other engines. Maybe other studios will come up with different solutions, but so far everyone is kind of waiting for a true next-gen game, and the new consoles are almost three years old. Engines are working with the same limitations they've had for the past ten years, like raster performance limiting geometry and requiring LOD management.
 
I'm not contradicting myself at all. DLSS was the first to kick it off and got the first-mover advantage. It all but consolidated Nvidia's current position. AMD had nothing and still has nothing. Intel showed up 4-5 years later, so obviously they're struggling to get traction. The key is that XeSS is just as promising as DLSS and works on all GPUs. The only reason Nvidia GPUs appear mandatory is simply that DLSS doesn't run on other GPUs. There's no technical reason for it; it's just about money.

Huh, I don't even know what you're arguing anymore. You said, and I quote:

My whole discussion is centered around this comment. This tangential discussion is not about upscaling in general; it's about that specific point. I mean, I thought I made it very obvious by only quoting that particular line. Everything you've raised so far really has nothing to do with the line I quoted. 3dfx has nothing to do with it, and neither does DX12. A developer does indeed suck if they make a game specifically for use with DLSS and DLSS exclusively. At least, that's what I'm currently arguing against.

Finally, I think you're definitely reading the room wrong. People aren't upset that their GPUs are delivering poor performance. They're upset that visually unimpressive games have an unjustifiable performance cost. If people felt the visuals justified the cost, there would be no complaints. Not too long ago, people were crying about TLOU on PC, saying its performance cost was unjustified. It turned out they were partially right, and it was partially fixed.
OIC, I tend to use it interchangeably from time to time. Well, to me DLSS should be a label expanded to cover all NN upscaling techniques. Any algorithm doing AI upscaling and AA is likely going to be a deep learning algorithm. It's quite unfair that the term is trademarked by Nvidia. I still think it doesn't matter if a game ultimately does this; it will just reduce its total addressable market to something tiny.

The tangential discussion for me doesn't change, however: the game runs without DLSS. It just runs a lot better with DLSS and much better still with frame generation. I don't mean to say that games should target Nvidia's solution only (my fault for implying that), but there is nothing wrong with setting a graphical bar high enough that such solutions are required to run games at high performance, and eventually required to even run them.
 
@iroboto I don't know how many games there are that support DLSS but don't also support FSR or XeSS. I'm not sure this is even a real issue in terms of other technologies not being adopted. I do wish Nvidia would follow the same steps as XeSS and have a version suitable for other architectures.
 
After seeing Immortals of Aveum, Remnant 2, and Fort Solis, the premise of the thread really shines through. They're all mediocre-looking games with middling art design, yet they're all built around upscaling technology and demolish hardware. Until now, I was seeing upscaling tech as a big bonus. It seems it'll become mandatory moving forward, rendering native resolution obsolete.

Additionally, the options in Immortals of Aveum scale poorly.
 
Fort Solis is another UE5.2 game with Lumen and Nanite.



Gave Fort Solis a try as I was wanting to give a UE5 game with Lumen/Nanite a go, and the sci-fi horror-ish genre is normally right up my alley.

It's certainly demanding, with the 3090 Ti screaming at its 475-500 W TDP most of the time, but what really stood out is how incredibly temporally unstable and coarse the GI resolve looks to be, even at absolute maximum settings. Everything looks like it's basically being lit from underneath a swimming pool with a glass bottom, as if it was doing a really good imitation of water caustics.

Shadow resolve was surprisingly coarse and noisy too, given how demanding the game is on the GPU; see the vending machine in the first clip.
There are also lots of really obvious disocclusion artifacts whenever the character moves; see the second clip when he moves his arms.


Not a ton of options to change in the settings, and changing the 'Global Illumination' setting from High (the highest) to Low doesn't seem to make much of a difference either.
DLSS doesn't clean it up, and on the lower resolution DLSS settings, the rate at which the temporal instability 'swims' speeds up, making it even more obvious.

Reminds me a bit of the noisy output in Control with RT on, but since Remedy was first out of the gate with RT, I cut them a bit more slack, and even then, it wasn't this distracting.
 
This seems to be a problem with Lumen's direct lighting and reflections. DESORDRE has this too, but switching to Nvidia's RTXDI reduces the noise.

The same can be seen in Fortnite indoors, too. You can only do so much, or rather so little, with software GI.
 
So yeah, perhaps simpler shadows that lose resolution at a distance are actually preferable to realistic shadows that come and go as you move?
The problem with pure native rendering is that most parts of the image are not rendered at native resolution anyway: shadow maps, global illumination, and reflections are rendered at half or quarter resolution, and the same goes for motion blur, depth of field, and other post-processing effects. This makes these defective elements stand out more, making the whole image feel out of place and "gamey". Shadows flicker and pop in and out more visibly, or appear very jagged; reflections also flicker, get blurred, and sometimes even show jagged edges; lighting also becomes blurred and smudged. Even LOD pop-in and pop-out becomes more obvious at native. Rendering of very thin objects is also bad, with lots of gaps and aliasing.

Competent upscalers already achieve the clarity of native rendering while having many other advantages (image stability, vastly better representation of thin objects, etc.), so why bother?
 
how incredibly temporally unstable and coarse the GI resolve looks to be, even at absolute maximum settings. Everything looks like it's basically being lit from underneath a swimming pool with a glass bottom, as if it was doing a really good imitation of water caustics
Definitely a problem with the Software Lumen they are using; they need to use Hardware Lumen with max settings.

This seems to be a problem with Lumen's direct lighting and reflections. DESORDRE has this too, but switching to Nvidia's RTXDI reduces the noise.
I know about RTXDI reducing the issue, but I think max Hardware Lumen in DESORDRE does indeed reduce it too?
 
I know this isn't exactly the right thread for it - apologies, but there isn't another one for Fort Solis yet that I can see.

Another observation, which I thought strange given my understanding of Lumen as a dynamic global illumination solution, similar to what you'd get from RT, just with lower temporal and spatial resolution for the most part.

However, there seems to be a very clear screen-space component to the lighting. As the camera pans up, with the red light above the door in view, you can see its effect on the floor grating, with the usual GI resolve noise and all the little red 'splats' on the floor, and it matches the rest of the floor gratings brightness-wise.

However, pan the camera down and the red light's contribution to the lighting of the floor grating disappears entirely; it goes very dark, almost as if it's entirely in shadow, while the floor grating to the right of it does not. It's odd to see all three things at once: the screen-space limitation of the technique used, the extremely low resolution (temporal and spatial) when it is visible, combined with the catastrophic performance.


YouTube's compression actually does a pretty good job of obscuring the temporal instability in the lighting; it's far worse and way more obvious in actual gameplay, but you can still see how the lighting changes dramatically based on the camera angle.

It's not all bad, though. I think I was too quick to blame the shadow technique in my first video; shadows in the game seem uniformly pretty great.
They could stand to be a bit higher resolution, especially at close distances where the sawtooth pattern on the edges is still pretty obvious, but the lighting sure needs some work.

Looking forward to DF's analysis! :)
 