Upscaling Technology Has Become A Crutch

Yeah I hear you on this one. I gathered a few in Remnant and IoA and such with the original intention of posting a few examples here, but I realized that it's just going to turn into a cherry-picking fest to feed the narratives. I can find terrible-looking shots in any of the games people mention. Hell, I can find terrible-looking photographs :p Conversely, I can find really nice-looking shots in most games, and definitely in the new ones. But ultimately I think the bias in these conversations is always going to swing towards people just picking whatever bad-looking screenshots they can find, and I don't really want to contribute to that cynical internet race to the bottom.

IIRC we used to have a thread here where people posted nice shots they had gathered from games, but I can't seem to find it now...

Only one I could find.
 
nVidia claims that Alan Wake 2 could run at 30 FPS in native 4K on a 4090. Only two months to release, but being able to play a game with path tracing at ~30 FPS would be a huge achievement.
Not to disagree or undermine any achievement here but again... that's with ray reuse which is no more "4k native" than any of the other things people are complaining about. I'm sure it'll look great and ideally run great as well, but don't let anyone - especially IHVs - fool you into making apples to oranges comparisons as they find different places to hide the spatiotemporal reuse. Just compare the quality at similar levels of targeted performance from the released games.
 
There's some distance where they switch to SDF impostors, I think, but the transitions are handled incredibly well.
The trees specifically are actually Nanite and VSMs out to the horizon.

You can spot the occasional place where there's some kind of pop-in, usually near the edges of the screen, but I'm not sure what that is.
There's still some hierarchical LOD going on that cuts out some smaller objects (and most notably the heavier physics representations and so on) and merges instances and similar. This isn't primarily for Nanite's purposes (more for other parts of the engine), but keeping the total instance counts at reasonable levels does help everywhere.
 
How many people slating UE5 in this thread are angry at Intel for "wasting" billions of transistors on OoOE or ever-larger caches that accommodate inefficient access patterns?
 
What's that saying "If my grandmother had wheels, she'd be a bike"
Great reference.
Let’s be mindful of what you're criticizing here.

A) Remnant 2 - indie studio with 3 price points for their game, $50-$80 (announced Dec 2022 - released July 2023 - studio size of 63 people)
B) Immortals of Aveum - the first release of a completely new studio (founded in 2018 - released Aug 2023 - studio size of 100 people)
C) Fort Solis - another indie studio's first major release (a team of 10-15 people completed development in MONTHS)
These things are not my problem. Just because a studio is small doesn't mean that it shouldn't be held to the same standards as other studios. There are other small studios that still deliver excellent products. It's not my job to defend developers, and if someone does a bad job, I'll call it out regardless of the circumstances behind it. These games have objectively bad performance.
UE5's official release year was 2022!
They never claimed UE5 would provide next-gen visual experiences, so expecting that from them is unfair. It's not right to hold them to that standard.
The Immortals of Aveum devs claimed this exact thing. Source is shown here
What all of these studios have accomplished with a fraction of the resources and time frame big-budget teams have is incredible. They are pumping out incredible graphics at a fraction of what other studios would have paid back in the UE4 days.
I'm happy you find it incredible but I most certainly do not. There's nothing impressive about the corridor shooter that is Immortals of Aveum, where most of the environments look surprisingly similar.
Feel free to talk shit about UE5 if Coalition or Ninja Theory bomb their UE5 titles with middling graphics. But cherry-picking on these guys without taking note of how little budget and manpower they had to create them - I don't think that argument works in your favour.
I'm not cherry-picking; that's what's out on UE5 right now. It's what we have to judge the engine by. The other devs might have better art and animations, but unless they rewrite large parts of UE5, the performance, based on the evidence we have so far, will be objectively bad.
 
If games made by smaller developers on smaller budgets cost the same as games made by the big developers on bigger budgets, all comparisons are perfectly fair IMO.
 
Right, but that argument is silly on the face of it. When Battlefield 4 was released, no GPU could run it at 120 fps at 4K, or even 60 fps at 1440p. The very notion that a GPU has to run it at some arbitrarily-chosen set of settings just because that's what you're used to running last-gen games at - games that were targeted at GPUs with 10x less power - doesn't make sense.
You keep bringing up Battlefield 4, but no one is making that argument. Secondly, it's a strawman. At the time Battlefield 4 was released, gaming at 4K 120 fps was not a thing. People were still gaming at sub-1080p resolutions; only 32% had 1080p monitors, source below. Furthermore, there were several GPUs that delivered a 60 fps experience at the resolutions of the time; a source is listed below. On top of that, Battlefield 4 was the best-looking and most advanced game at its time of release, which is not something you can say about any of the games discussed. Finally, Maxwell GPUs released literally 3 months after release, blitzing past any performance issues.

[Images: BF4 Benchmarks.png, BF4 Benchmarks 2.png, Steam-Hardware-Stats-December-2013.jpg]
Not to belabor the point, but were half-res particles a crutch or an optimization? Were any of the myriad clever approximations (which, in the end, is all that real-time graphics is) crutches or "optimizations"? It's fine to discuss individual choices in detail, but the global notion that a game has to run at X performance at Y resolution or else it's "poorly optimized" is silly. We've done reconstruction for ages - it has just gotten more clever recently.
These are not even remotely the same as what is being discussed, and I don't think it's fair to make such comparisons. The devs are already using half-res effects and combining them with upscaling while upscaling from PS Vita/PS360 resolutions on consoles. You know these are not the same.
Ultimately, these games generally *do* run fine on mid-to-high end machines... and I can say that confidently because (unlike I presume many in this thread) I've actually played them. They are more expensive per pixel than games of the past, but they also look better than past dynamically-lit games.
Define "runs fine" because your definition of fine is not universal accepted. The fact that a lot of people are complaining about this means it can't just be hand waved away. This is just like when i tried to hand wave away TLOU vram usage and the pushback was strong. Turns out I was wrong and the ram usage was most certainly not justifiable. On this, i can tell you that you're wrong. The performance cost of these games when compared to their visuals is absolutely not justifiable. Go and read the youtube comments of benchmark videos of these games. Go and read other forums, the complaints are very loud and very clear.
 

If games made by smaller developers on smaller budgets cost the same as games made by the big developers on bigger budgets, all comparisons are perfectly fair IMO.

Fort Solis is $32.50 CAD
Remnant 2 is $65 CAD
Immortals of Aveum is $80 CAD

Diablo 4 is $90 CAD
EA FC (fifa) 24 is $90 CAD

Immortals of Aveum is the only UE5 title that's kind of marketing itself as a AAA game.
 
...

These are not even remotely the same as what is being discussed, and I don't think it's fair to make such comparisons. The devs are already using half-res effects and combining them with upscaling while upscaling from PS Vita/PS360 resolutions on consoles. You know these are not the same.

Define "runs fine" because your definition of fine is not universal accepted. The fact that a lot of people are complaining about this means it can't just be hand waved away. Go and read the youtube comments of benchmark videos of these games. Go and read other forums, the complaints are very loud and very clear.

I'm actually curious why you think it's an unfair comparison? Previously, games rendered "opaque geometry" at native resolution from limited geometry that was optimized to have polygons cover many pixels, and then shadow maps would be generated at an independent resolution, with effects/post-processing at reduced resolution like 1/2 or 1/4. I don't really understand how varying the resolution of the "opaque geometry" is different, especially in the case where you increase the resolution of the shadow maps and the GI lighting and reflections.
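To put a number on those reduced-rate passes - a trivial sketch, assuming a 4K target and a half-res-per-axis effects buffer, numbers purely for illustration:

```python
# A half-res effects buffer shades a quarter of the pixels of the final
# frame, then gets upsampled/composited into the full-res image: "native"
# output, non-native work. The 4K target is an illustrative assumption.
full_w, full_h = 3840, 2160
half_w, half_h = full_w // 2, full_h // 2

shaded_fraction = (half_w * half_h) / (full_w * full_h)
print(shaded_fraction)  # 0.25 -> that pass shades 4x fewer pixels
```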

Previously you'd optimize your LODs so a polygon would cover something like 16 pixels (I think?) and you'd limit the number of small triangles (smaller than 16 pixels) because they'd tank fragment generation. So you could have a native resolution of 4K, and you generate at least 1 fragment per pixel, but your geometry resolution is actually much lower if you're thinking of it in terms of pixel coverage. Curves can generate a ton of polygons, so you get the issue of looking at wheels up close and being able to see the polygons on the edges. This is also why we got displacement maps and bump maps to fake geometry, with limitations.

The software rasterizer that Epic built for Nanite can outperform the HW rasterizer and generate fragments much faster for those small triangles, so you can actually increase your geometry resolution, have per-pixel polygons, and get actually smooth curves. The geometry resolution also scales by distance, so you get "perfect" level of detail without having to manage multiple LODs. I know it has performance considerations like overdraw, so putting leaves on trees, especially if they're transparent, has to be handled carefully, but that's true of any HW rasterizer, which is why trees were barren in games, or billboards, for so long.

So Nanite games likely have a smaller resolution for the depth, stencil, and colour render targets etc. because of performance, but a higher geometry resolution per pixel if you're looking at it from pixel coverage, if that makes sense. Also, VSMs have much higher resolution than other "traditional" shadow maps.
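To make the pixel-coverage idea concrete, here's a minimal sketch of that classic LOD heuristic. The 16-pixel target, the 60° FOV, and the assumption that each coarser LOD halves triangle count are all illustrative guesses, not how any particular engine (or Nanite) actually works:

```python
import math

def triangle_area_px(edge_m, distance_m, fov_y_deg=60.0, screen_h_px=2160):
    # Rough screen-space area in pixels of a triangle with the given edge
    # length, under a simple perspective projection (no aspect/angle terms).
    px_per_m = screen_h_px / (2.0 * distance_m * math.tan(math.radians(fov_y_deg) / 2.0))
    edge_px = edge_m * px_per_m
    return (math.sqrt(3.0) / 4.0) * edge_px ** 2  # equilateral-triangle area

def pick_lod(base_edge_m, distance_m, target_area_px=16.0, num_lods=6):
    # Walk toward coarser LODs (each halves triangle count, so average
    # triangle area roughly doubles) until a triangle covers ~16 pixels.
    for lod in range(num_lods):
        edge_m = base_edge_m * (math.sqrt(2.0) ** lod)  # area doubles per LOD
        if triangle_area_px(edge_m, distance_m) >= target_area_px:
            return lod
    return num_lods - 1  # nothing coarse enough: use the coarsest LOD

print(pick_lod(0.01, 2.0))    # close up: finest LOD (0)
print(pick_lod(0.01, 200.0))  # far away: coarsest available LOD
```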

There are just a lot of ways to look at this. You're going to have techniques like importance sampling that will vary the sample rate per-pixel to where they are most needed, and integration of samples over time. What does resolution mean if you're not talking about how many samples were taken per pixel? Like you could make a path tracer and cast one ray per pixel and limit to one bounce per intersection at 4k, or you could cast 4 rays per pixel and limit to 4 bounces per intersection at 1080p. Which of those has higher fidelity? The 1080p one would. We tend to substitute resolution for fidelity, but it's not necessarily true. 4k is more like sharpness or clarity, but not fidelity in the true sense.
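The ray/bounce trade-off in that hypothetical works out like this (just the numbers from the example above, not any real renderer's budget):

```python
px_4k    = 3840 * 2160   # 8,294,400 pixels
px_1080p = 1920 * 1080   # 2,073,600 pixels

work_4k    = px_4k * 1 * 1      # 1 ray/pixel, 1 bounce  -> ~8.3M ray segments
work_1080p = px_1080p * 4 * 4   # 4 rays/pixel, 4 bounces -> ~33.2M ray segments

print(work_1080p / work_4k)  # 4.0 -> the 1080p image integrates 4x more light-path work
```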

Everybody has different priorities of what they want.

Some absolute psychos (30 fps is good enough :poop:) would order them:
1) pixel fidelity (ray-trace me daddy)
2) sharpness/clarity (will DLSS)
3) performance (I only take ray-tracing screenshots for forums)

4090 owners that love 4k benchmarks would order them:
1) sharpness/clarity (4k or bust)
2) pixel fidelity (ultra/epic is the only setting that matters when I screenshot my fps counter)
3) performance (It had better be higher than 60 or the devs will learn my name)

Me, being an absolute god-tier gamer, would probably order them:
1) performance (240Hz or bust)
2) performance (120Hz or bust)
3) performance (Oh god ... please run at 90 fps)
4) sharpness/clarity (turn settings low so I can see the child-gamers I'm head-shotting in fps games or fortnite, DLSS always)
5) pixel fidelity (turn up the graphics unless I'm at a disadvantage, until I'm almost gpu-limited but not quite unless the game supports reflex and then I'll barely hit the gpu-limited range )
6) it's a Remedy game so pixel fidelity is actually #1 or #2.
 
Not to disagree or undermine any achievement here but again... that's with ray reuse which is no more "4k native" than any of the other things people are complaining about. I'm sure it'll look great and ideally run great as well, but don't let anyone - especially IHVs - fool you into making apples to oranges comparisons as they find different places to hide the spatiotemporal reuse. Just compare the quality at similar levels of targeted performance from the released games.
I saw it live just yesterday, where I could test it on and off in Alan Wake, CP2077 and Portal RTX vs. NRD and other denoisers - DLSS 3.5 really wrecks NRD and any other denoising I have ever seen. Really impressive!

Beyond massively boosting inner-surface quality and detail and light propagation from undersampling, it also cut down on image/lighting retention and ghosting. IMO it would dramatically improve Lumen reflections and GI.

We saw a variety of non-curated use cases of specular noise/stability and ghosting, and of diffuse light and shadow, both in movement and while still. The biggest, most impressive one, beyond the lack of specular reflection ghosting vs. the typical stuff, was turning lights on and off. Fade-out of diffuse, be gone!

We should have impressions in the latest DF Direct.
 
Eh, watching actual videos of people playing Immortals on lower-end hardware, the performance is actually... pretty good?


It runs at 60 FPS most of the time using the high preset and DLSS Performance on a 3050. That's decent performance for a title with such a high polygon count and dynamic lighting. Mind you, that's just one area of the game, idk how it is in others. VRAM usage is good as well.

I think the issue is that the game doesn't scale well with its settings, as medium and low don't appear to do much. And they claim you need an RTX 2080 Super as min spec, even though it runs on a Series S.
 
Eh, watching actual videos of people playing Immortals on lower-end hardware, the performance is actually... pretty good?


It runs at 60 FPS most of the time using the high preset and DLSS Performance on a 3050. That's decent performance for a title with such a high polygon count and dynamic lighting. Mind you, that's just one area of the game, idk how it is in others. VRAM usage is good as well.

I think the issue is that the game doesn't scale well with its settings, as medium and low don't appear to do much. And they claim you need an RTX 2080 Super as min spec, even though it runs on a Series S.
A Series S is only like 25-30% faster than a 1650, I think? If it looks anything like that section, eesh.
 
Not to disagree or undermine any achievement here but again... that's with ray reuse which is no more "4k native" than any of the other things people are complaining about. I'm sure it'll look great and ideally run great as well, but don't let anyone - especially IHVs - fool you into making apples to oranges comparisons as they find different places to hide the spatiotemporal reuse. Just compare the quality at similar levels of targeted performance from the released games.
But I can't play current UE5 games at 4K 60 FPS on a 4090. I can't even play these games at over 100 FPS in 1440p. I have to use additional temporal information to get to 60+ FPS.

It is a race of upscaling and reconstruction. nVidia has done it with DLSS - make an upscaler so advanced that even temporal information from three previous frames is usable. And from the information and video, they are doing it with Ray Reconstruction again: make denoising so good that you can use that many different temporal samples for a stable and good-looking path-traced frame.

I cannot compare quality at a certain performance level because the released UE5 games do not offer the same image quality. This isn't possible. And it will get worse in the future.
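For what it's worth, the "temporal information" being reused boils down to blending each new frame into a running history, something like the minimal sketch below. Real upscalers/denoisers (DLSS included) add motion-vector reprojection, history clamping and learned weights on top; none of these names or numbers come from them:

```python
def temporal_accumulate(history, current, alpha=0.1):
    # Exponential blend of the new (noisy/low-sample) frame into the history.
    # Low alpha -> more reuse: less noise, but more ghosting when the scene
    # changes - exactly the trade-off the denoiser has to manage.
    return [[(1.0 - alpha) * h + alpha * c for h, c in zip(hrow, crow)]
            for hrow, crow in zip(history, current)]

# Toy usage: a constant "signal" of 1.0 observed through noisy samples.
frame_a = [[0.8, 1.2], [1.1, 0.9]]
frame_b = [[1.3, 0.7], [0.9, 1.1]]
hist = frame_a
hist = temporal_accumulate(hist, frame_b)
print(hist)  # values get pulled toward 1.0 as samples accumulate over frames
```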
 
But I can't play current UE5 games at 4K 60 FPS on a 4090. I can't even play these games at over 100 FPS in 1440p. I have to use additional temporal information to get to 60+ FPS.

It is a race of upscaling and reconstruction. nVidia has done it with DLSS - make an upscaler so advanced that even temporal information from three previous frames is usable. And from the information and video, they are doing it with Ray Reconstruction again: make denoising so good that you can use that many different temporal samples for a stable and good-looking path-traced frame.

I cannot compare quality at a certain performance level because the released UE5 games do not offer the same image quality. This isn't possible. And it will get worse in the future.

He's saying that if you're targeting 60 fps, for example, just adjust the settings to get 60 fps in both games and then compare the visuals. One game could be 4K DLSS Performance ultra and the other could be native 4K medium. You equalize performance and then compare quality.
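A trivial sketch of that methodology, with made-up preset names and fps numbers purely for illustration:

```python
def pick_iso_perf_settings(presets, target_fps=60):
    # presets: list of (name, measured_fps), ordered by your own quality
    # preference, best-looking first. Return the best-looking preset that
    # still hits the performance target, then compare games visually there.
    for name, fps in presets:
        if fps >= target_fps:
            return name
    return presets[-1][0]  # nothing hits the target: take the fastest option

game_a = [("native 4K ultra", 38), ("4K DLSS Quality ultra", 55),
          ("4K DLSS Performance ultra", 72), ("native 4K medium", 61)]
print(pick_iso_perf_settings(game_a))  # -> "4K DLSS Performance ultra"
```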
 
But I can't play current UE5 games at 4K 60 FPS on a 4090. I can't even play these games at over 100 FPS in 1440p. I have to use additional temporal information to get to 60+ FPS.
Same thing for Crysis back in the day. You had a monster GPU, cranked everything up to max, and the framerate struggled. We waited for better GPUs to appear to run it at higher framerates. If only upscaling had existed back then, we could have run it at smoother framerates on those same underpowered GPUs.
 
I remember a huge amount of controversy when it was discovered that Killzone Shadow Fall was using an upsampling technique to hit higher frame rates in multiplayer, almost 10 years ago now. I'm sort of surprised that the same attitudes around upsampling have persisted for so long, at least in some corners of the discourse.

I just don't feel any particular attachment to the idea that a game is running at a native resolution anymore, I only care about the resolved image quality on an actual screen. And even then I do think that people oversell some of the image quality faults introduced by upsampling at times, especially sitting a decent distance from a large TV display. It's easy to pick apart problems in screenshots but somewhat harder in an actual game in motion with full-res post processing.

Both Remnant 2 and Immortals of Aveum are doing some impressive stuff. At a glance a lot of people seem to be writing off these titles as "last-gen", but I think there's a lot to appreciate, especially considering that these are very early UE5 titles that are both targeting 60fps on consoles.
 
I remember a huge amount of controversy when it was discovered that Killzone Shadow Fall was using an upsampling technique to hit higher frame-rates in multiplayer almost 10 years ago now.
Oh yeah! Some chumps even tried a class-action lawsuit against Sony! Though to be fair, Guerrilla did misrepresent its rendering.


Back when this case was first filed, Killzone: Shadow Fall developer Guerrilla Games noted that "In both SP and MP, Killzone Shadow Fall outputs a full, unscaled 1080p image at up to 60 fps. Native is often used to indicate images that are not scaled; it is native by that definition."

Thus began the arguments on what Native even means when we journeyed into the Fourth Dimension!
 
These things are not my problem. Just because a studio is small doesn't mean that it shouldn't be held to the same standards as other studios. There are other small studios that still deliver excellent products. It's not my job to defend developers, and if someone does a bad job, I'll call it out regardless of the circumstances behind it. These games have objectively bad performance.
You should be comparing development budgets and time. Just because a studio has access to the best tools, it can't be expected to make, in 1-2 years and with a fraction of the studio size, something comparable to a game with 5-6 years of development time.

Aside from Immortals, which is still below regular prices, the other two are priced well below the standard game cost. Those are indie-plus prices.

Budget and labour are still going to be the largest factors in how good a game looks, along with the amount of time they really have to expand on game design, art, etc. You should compare them to other titles with the same amount of development time and budget.
 
He's saying that if you're targeting 60 fps, for example, just adjust the settings to get 60 fps in both games and then compare the visuals. One game could be 4K DLSS Performance ultra and the other could be native 4K medium. You equalize performance and then compare quality.
But I don't target 60 fps. You can buy plenty of 120 Hz 4K displays... 60 FPS was just a number for 4K.

The problem with image quality is that UE5 is in this weird spot where it is much slower than current engines and can't provide the same image quality as optimized path tracing. And this means I can only compare the result of the upscaling process.
 