Upscaling Technology Has Become A Crutch

I don't feel that's the actual takeaway from that discussion. The complaint was about unrealistic expectations, more so with FSR2 (let's avoid going down that IHV debate), and not that DLSS/FSR2 should not be factored in at all.

Related to expectations, I also just don't feel forgoing upscaling is realistic here. If you look at the entire video quoted, they specifically discuss IQ issues with the 60 fps mode. However, I just don't see that the hardware leap was enough to go from 1080p30 to 4K60 along with generational fidelity improvements (keep in mind we're well into the diminishing-returns stage of perceived improvements relative to the hardware) if we want to render everything at the so-called "native resolution." As such, without the current upscaling technologies there's going to have to be something else doing the upscaling. Or I guess do we just target native 1080p30 like last gen?
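To put a rough number on that gap, here's a back-of-the-envelope sketch counting raw pixel throughput only (ignoring bandwidth, per-pixel shading cost, and everything else that also scales):

```python
# Raw pixel-throughput multiplier for 1080p30 -> 4K60, ignoring every other
# cost that also grows with resolution (bandwidth, shading complexity, etc.).
last_gen = 1920 * 1080 * 30    # pixels per second at 1080p30
target   = 3840 * 2160 * 60    # pixels per second at 4K60
print(target / last_gen)       # 8.0 -> an 8x jump before any fidelity gains
```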

I think the rough target for 60 fps should be 1080p60 upscaled to 4K60. Otherwise your image quality will be poor, but if you go too high you risk not having enough rendering power left and end up looking like a PS4 game. DF is right that there is some minimum you need to hit for upscaling to work. FSR2 isn't great, and other reconstruction methods may actually be better. Spider-Man 2 is in the 1080p-1440p range in performance mode, which is probably appropriate, but I think they have better reconstruction than FSR2. I haven't watched the content to know if they're still using a custom method, but to me it looks better than FSR.

I don't think any devs are "skipping" optimization and just relying primarily on upscaling.
 
I don't feel that's the actual takeaway from that discussion. The complaint was about unrealistic expectations, more so with FSR2 (let's avoid going down that IHV debate), and not that DLSS/FSR2 should not be factored in at all.
I wasn’t suggesting that at all though. I think that if they want to use upscaling, the base resolution needs to be high enough. In my estimate, you should aim to deliver a minimum of 1080p 60fps on the consoles. If you use FSR/DLSS to upscale from 1080p, it’s tolerable. Trying to upscale from 540p, 768p and all the other nonsense derivatives being used is simply unacceptable.
Related to expectations, I also just don't feel forgoing upscaling is realistic here. If you look at the entire video quoted, they specifically discuss IQ issues with the 60 fps mode. However, I just don't see that the hardware leap was enough to go from 1080p30 to 4K60 along with generational fidelity improvements (keep in mind we're well into the diminishing-returns stage of perceived improvements relative to the hardware) if we want to render everything at the so-called "native resolution." As such, without the current upscaling technologies there's going to have to be something else doing the upscaling. Or I guess do we just target native 1080p30 like last gen?
I don't agree with this sentiment at all. No one is saying that they need to target 4K native. There's more than enough power to deliver a next-gen experience at 1080p 60fps. I'd argue there's enough power even at 1440p. The issue is that you have to be extremely wise with your feature set, and developers are generally being rather unwise about it.

Look at the attempts to use hardware RT. The hardware is not capable of delivering a good hardware RT implementation without eating a significant chunk of the render budget. It's a giant waste of computational resources on console, but you still have devs trying to fit a square peg into a round hole. Look at Spider-Man 2, which is praised for its RT implementation: it hardly looks like a generational upgrade from Miles Morales, all for some subpar RT effects. They could have used a combination of dynamic cube maps, SSR, etc. that would generally lead to a more performant result.

In general, devs should stick to traditional raster methods on consoles until they have dedicated RT hardware capable of delivering performant experiences. Even look at AW2 on console: Remedy spent so many resources on effects trying to make the world look real. Unfortunately it couldn't be more fake, with the wacky physics, static world with barely any interactivity, low-res shadows, subpar animation, etc. The image quality is so bad with FSR2; the aliasing is so poor. Then look at the ponytail physics: even Tomb Raider 2013 has better hair physics with TressFX or whatever it's called. For a game where you mostly walk slowly through tight corridors, I feel like they could have done the minimum to hit a higher base resolution. It's like they took the unoptimized effects from Quantum Break and turned them up to 11.
 
Recently I have been thinking about the upscaling side of this conversation, and how it relates to the increase in resolution and pixel density.
And I think I have come up with a half-decent framework for this type of discussion... TV.
Given I come from a TV/film production background (although not anymore), I guess it's not that surprising.
So bear with me here...

TV used to be 4:3 SD, essentially 720x576 @ 50i (PAL) or 720x480 @ 59.94i (NTSC).
These resolutions and pixel densities established the norms for what would be displayed on a screen, especially a TV screen.
Sure, movies/film are perceptually a lot higher res, but that's in a cinema; at home it was still SD.
Then we moved to HD, with formats like 720p and 1080i, and then on to 1080p.
Then eventually we moved to UHD/4K.

The point being that the amount of image we usually see on the screen hasn't changed much, although there is a good argument to be made that the transition from 4:3 to 16:9 did increase the amount a viewer would see.
What we have got though, is a huge increase in the detail of the objects we see on the screen.

So we can use this framework to discuss and compare upscaling methods, and possibly use a TV's built-in upscaling as a starting point.
Most people would agree that scaling SD to UHD is gonna look like shit.
What about 720p to UHD?
What about 1080p to UHD?

I like to think I've got a pretty good eye for this sort of stuff; years of watching 100% raw HD and UHD footage on reference displays has somewhat ruined me, together with watching DF videos. But I'm probably more sensitive to bad digital TV encodes than to poor upscaling tech a la FSR or DLSS.
Most of the time, though, I can't see the difference between a good upscale of a 1080p signal and a native UHD signal - when in motion, that is.

It would be interesting to see what the very best offline scalers can do with SD / HD footage when scaling to UHD.
I suspect that nothing is going to be able to take an SD signal and produce a super crisp UHD output, except for the most basic, bland images.

And thus, my actual point - expecting upscalers like FSR/DLSS and the rest to work from resolutions approximating SD is just unrealistic.
The amount of "stuff" shown on screen isn't increasing, but the detail IS, so we are probably OK to use upscalers, so long as the starting image has enough detail.

Currently, IMHO, that amount of detail is somewhere between 1080p and 1440p.

Again, I'd love to see a comparison between FSR/DLSS etc. and top-end offline scalers, which are VERY good.
 
Was watching this young man's video on Jedi: Survivor, then his Nanite one, and I'm not sure I've seen his reason for upscaling becoming a crutch mentioned here. He's saying it's become needed (or a crutch) because of excessive overdraw, an interesting case he puts forward. Someone can cross-post the video to the UE5 thread if they want to discuss the Nanite data, which is actually most of the video, but I was more curious what people thought of his reason for needing upscaling.

 
Doesn't upscaling increase overdraw for traditional hardware rasterization? The smaller the triangles in terms of pixels, the more overdraw there is, so decreasing rendering resolution while keeping the same geometry should increase overdraw. Upscaling would still provide a boost to overall performance, but if the geometry detail isn't lowered along with the resolution then overdraw issues will only be compounded.
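To put a rough shape on that intuition, here's a toy model (illustrative only: it assumes a fixed number of visible triangles spread evenly over the screen and a GPU that shades whole 2x2 pixel quads; the function and numbers are made up for the sketch):

```python
# Toy model only: N visible triangles evenly tiling the screen, GPU shading
# whole 2x2 pixel quads. Shows why shrinking the internal resolution while
# keeping the same geometry density hurts quad-shading efficiency.
import math

def quad_efficiency(width, height, visible_triangles):
    area = (width * height) / visible_triangles    # avg pixel footprint per triangle
    edge = math.sqrt(area)                         # treat the footprint as a square
    quads = math.ceil((edge + 1) / 2) ** 2         # rough count of 2x2 quads touched
    return area / (quads * 4)                      # useful pixels / pixels actually shaded

for res in [(3840, 2160), (1920, 1080), (1280, 720)]:
    print(res, f"~{quad_efficiency(*res, 500_000):.0%} useful shading work")
```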
 
Doesn't upscaling increase overdraw for traditional hardware rasterization? The smaller the triangles in terms of pixels, the more overdraw there is, so decreasing rendering resolution while keeping the same geometry should increase overdraw. Upscaling would still provide a boost to overall performance, but if the geometry detail isn't lowered along with the resolution then overdraw issues will only be compounded.

LOD management should handle that in both traditional hardware rasterization and nanite.
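To make that concrete, here is a minimal sketch of resolution-aware LOD selection (the thresholds, numbers and helper names are hypothetical, not from any particular engine): because the object's projected size is measured in render-target pixels, dropping the internal resolution automatically biases selection toward coarser LODs.

```python
import math

def projected_pixels(object_radius_m, distance_m, vertical_fov_deg, render_height_px):
    # Approximate on-screen height of the object, in render-target pixels.
    angular = 2.0 * math.atan(object_radius_m / distance_m)
    return angular / math.radians(vertical_fov_deg) * render_height_px

def select_lod(px, thresholds=(256, 64, 16)):  # hypothetical per-LOD cutoffs
    for lod, cutoff in enumerate(thresholds):
        if px >= cutoff:
            return lod
    return len(thresholds)

for h in (2160, 1080):  # native 4K vs a 1080p internal resolution
    px = projected_pixels(1.0, 40.0, 60.0, h)
    print(h, round(px), "-> LOD", select_lod(px))
```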
 
@GhostofWar that guy is outrage farming to collect donations to "fix" the game industry with AI or something.
Is anything he says in his videos valid? I've only seen that one and the Star Wars one. Not sure how AI fixes the game industry; it sounds like a recipe for a whole industry of modern Ubisoft games. I'm not saying Ubisoft doesn't make games people want to play, just that they all feel very similar.
 
Is anything he says in his videos valid? I've only seen that one and the Star Wars one. Not sure how AI fixes the game industry; it sounds like a recipe for a whole industry of modern Ubisoft games. I'm not saying Ubisoft doesn't make games people want to play, just that they all feel very similar.

There are definitely some valid bits. Like, quad overdraw is a real thing. I'm sure some of his commentary about TAA settings in UE and how they can be optimized to minimize blur is valid. I think his commentary about Nanite and stochastic rendering is highly, highly subjective. Like, he'll constantly refer to old games as "photorealistic" or pursuing realism, but if you go back and look at them they look incredibly outdated. Hell, he'll even put up clips or screenshots of the games as he's talking about how great old games looked, and I wonder if his eyes are working.

I think he fails to acknowledge the limitations of older rendering technology and just kind of hand-waves them off because of performance. Like, he generally seems to believe Nanite is a symptom of laziness and not wanting to deal with LOD management, instead of acknowledging that every single game that has LODs has visible transitions. He also doesn't really address the fact that a lot of new games that run poorly are not using anything like Nanite. He even blames UE5 for the performance in Alan Wake 2, because Epic has fooled the industry into relying on upscaling or something ...

Mainly, the guy is outrage baiting to solicit money to fork UE5 and "fix" it using AI tools or something. I think he's someone who is obviously familiar with UE and has a base level of technical knowledge, but is basically trying to steal people's money. He can prove me wrong by releasing a game, or demos, or something that proves to me he's the messiah he's selling himself as. As far as I can tell, the guy has never released anything. No released games to his name, no demos, no nothing.
 
Yes, UE5 fooled and influenced the whole industry so that everyone would rely on extreme upscaling. It is a hard truth that previous games, running at 4K on the current consoles, looked better than the new games, which use technically overpowered UE5 graphics at a much lower resolution!
A year or two ago even I argued in favor of UE5, and many people agreed, because the presentations were spectacular and we thought they could later bring higher resolutions with this engine on the current generation. However, now, a year later, I think it's safe to say that UE5 is not for this console generation. Many games have been released, and here are the results...


Current example: you can try the game Off The Grid, which uses UE5 at 60 FPS on console. There is even RT with rich open-world graphics; it would look great if it didn't run at about 900p (?) native resolution...


Then, loading Apex Legends on the console, we get what we are used to on these consoles: the native 4K/60 FPS experience. The difference is extraordinary in favor of Apex! The many details in the new game are useless if those details are lost due to the low resolution.

The argument that this is okay doesn't hold up, because we play on 65-inch TVs these days and resolution is more important than ever. And don't tell me Apex Legends' graphics can't be beaten in native 4K on Series X or PS5! I'm sure you could do it with console-adapted engines. But instead there is UE5 with billions of polygons and shit image quality... No way!
Because of this, I only play certain 4K(ish)/60 FPS games on my current console. Nice...
 
@QPlayer Please give me the timeline and the proof that Epic fooled and coerced the industry into using TAA, upscaling and other stochastic rendering techniques.

Walk me through the first game to use TAA, the first game to use upscaling and explain how we got from there until now and what Epic’s involvement was at each step.

Edit: bonus points if you can explain how Remedy was fooled by Epic into using a low base resolution before upscaling in Alan Wake 2.
 
I think QPlayer may be suggesting the same thing I was saying more than a year ago.
The early tech demonstration was, naturally, jaw-dropping. But the visuals were so precisely optimized, and the experience so limited, that not enough developers took into account that their own games would struggle to achieve high levels of visual fidelity using UE5 outside of such a constrained example.
 
I think QPlayer may be suggesting the same thing I was saying more than a year ago.
The early tech demonstration was, naturally, jaw-dropping. But the visuals were so precisely optimized, and the experience so limited, that not enough developers took into account that their own games would struggle to achieve high levels of visual fidelity using UE5 outside of such a constrained example.

He wrote “Yes, UE5 fooled and influenced the whole industry so that everyone would rely on the extreme upscaling” in response to me pointing out that this video guy blamed UE5 for upscaling in Alan Wake 2.
 
@QPlayer Please give me the timeline and the proof that Epic fooled and coerced the industry into using TAA, upscaling and other stochastic rendering techniques.

Walk me through the first game to use TAA, the first game to use upscaling and explain how we got from there until now and what Epic’s involvement was at each step.

Edit: bonus points if you can explain how Remedy was fooled by Epic into using a low base resolution before upscaling in Alan Wake 2.
A timeline? Well, if we look at when UE5 was first presented, we can see that it was four and a half years ago. Furthermore, I am sure several major development studios were able to get insight into the technological innovations even earlier, so they could start adapting them to their own engines in time.

If we look at the reasons that led to the extreme performance requirements, we come specifically to the virtualized geometry system as the main one. We are well aware that this technique started to be introduced more widely with UE5's Nanite. Today, a significant number of games in development use UE5. From here on, its influence on the industry is unquestionable.

However, I have no doubt that this is great technology (along with everything that comes with it, see VSM and Lumen), but the performance demands are beyond what the current generation of consoles can deliver.

In short: once we got used to 4K image quality at the beginning of this generation, we don't want to go back to the resolutions of a generation or two ago when it comes to new games.

At the same time, it would be possible to use UE5 at 2K or 4K/60 FPS on current-generation hardware without the excessively demanding technologies mentioned above. There are examples of this. A few... This approach should be used for all current-generation games that require 60/120 FPS, e.g. FPS/multiplayer games. Or, for that matter, all games could be run this way at high resolution with 60 FPS on current consoles.
 
Better upscalers make so much difference. DLSS Balanced looks pretty good to me in many games at 1440p, and that's a base resolution of like 835p. Even DLSS Performance (720p ➡️ 1440p) can look okay. Switch to FSR at the same resolutions and it looks like gobbledygook.
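For reference, those base resolutions fall out of the per-axis render-scale factors commonly cited for the DLSS presets (roughly 67% Quality, 58% Balanced, 50% Performance, 33% Ultra Performance); a quick sketch:

```python
# Per-axis render-scale factors commonly cited for DLSS presets.
DLSS_SCALES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def base_resolution(output_w, output_h, scale):
    return round(output_w * scale), round(output_h * scale)

for preset, scale in DLSS_SCALES.items():
    w, h = base_resolution(2560, 1440, scale)
    print(f"{preset:>17}: {w}x{h} -> 2560x1440")
```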
 
Yes, UE5 fooled and influenced the whole industry so that everyone would rely on extreme upscaling.

Just going to highlight that this is an absolutely ridiculous claim. The "whole industry" was not fooled by anyone. The Alan Wake 2 point by that Threat Interactive guy is especially ludicrous because Alan Wake 1 on 360 was criticized for basically being nearly standard definition.

Here's my version of events. There have always been console titles that upscaled to the television's native resolution. None of this is new. Upscaling and TAA were not invented by Epic. The industry has not been fooled by Epic into adopting these technologies. We've seen developers use all kinds of in-house engines to do these exact things well before UE5. There were upscaled games on PS360, and I'm pretty sure on Xbox and PS2. We saw TAA on Xbox 360, with Halo Reach at least, and then in plenty of titles in the PS4/Xbox One gen.

The reason we don't see many games pushing close to native 4K on console is that there's a tension between resolution and other graphics improvements. During previous console generations, launch games would always be criticized for looking like last-gen games at a higher resolution. People don't really care about native resolution and never have. Where you fall in the balance between resolution and other graphical improvements is going to be highly subjective. Increasing resolution on its own is not objectively good.

Now that hardware improvements are slowing, and are not as radical in terms of new technology, a 4x jump in display resolution from 1080p to 4K basically eats up the vast majority of your rendering capability, assuming you don't hit memory bandwidth or memory capacity limits first. Add in the diminishing returns of compute cost versus visual improvement, and you're left with little to no choice but to consider upscaling, or to make games that look like last gen at a higher resolution. It's not actually anything different from the past; it's just that the ratio of rendered resolution to output resolution is growing wider because of the hardware landscape and 4K displays. It's why 8K displays are such a joke and should be completely ignored for gaming. A console is a $500 box, not a supercomputer. It's not even a high-end computer.
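For a sense of how fast the output side grows, raw pixel counts only:

```python
# Output-resolution pixel counts and the step-up factor at each jump.
displays = {"720p": (1280, 720), "1080p": (1920, 1080),
            "4K": (3840, 2160), "8K": (7680, 4320)}

prev = None
for name, (w, h) in displays.items():
    px = w * h
    step = f"  ({px / prev:.2f}x the previous step)" if prev else ""
    print(f"{name:>5}: {px:>10,} pixels{step}")
    prev = px
```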
 
Better upscalers make so much difference. DLSS Balanced looks pretty good to me in many games at 1440p, and that's a base resolution of like 835p. Even DLSS Performance (720p ➡️ 1440p) can look okay. Switch to FSR at the same resolutions and it looks like gobbledygook.
Yes. FSR is only good with a 1440p base resolution. DLSS holds up better at lower pixel densities. But DLSS doesn't exist on consoles, and my post is about current-generation consoles.
 
Just going to highlight that this is an absolutely ridiculous claim. The "whole industry" was not fooled by anyone. The Alan Wake 2 point by that Threat Interactive guy is especially ludicrous because Alan Wake 1 on 360 was criticized for basically being nearly standard definition.
Remedy has always liked upscaling. Quantum Break had a bespoke upscaling solution, Control launched with a special version of DLSS ("1.9") that no other game had before getting patched to use DLSS 2.0, and now AW2 is one of the first third-party games to have PSSR.
 