Intel XeSS upscaling

Upscaling at this point has likely been baked into the performance targets for many games. Capcom couldn't have released Monster Hunter Wilds in the state it was in without upscaling. And the game doesn't even look very good. To me this is basically using it as a crutch. I'd have a different opinion if the game looked amazing or ran with acceptable performance for its mediocre visuals.
 
Sure they could. They'd just remove a bunch of heavier effects or make the "Ultra" level completely unplayable. The presence of upscaling tech does not lead to that in any way.
To be clear, I think DLSS is amazing. But like anything, it can be abused. There's a lot they could've done, but what they did IMO is make the assumption that upscaling and framegen could polish this turd. It's right there in the system requirements. They claim framegen is necessary to hit 60fps and that's not how it's supposed to work.
 
To be clear, I think DLSS is amazing. But like anything, it can be abused.
No, it can't, and this whole suggestion is stupid beyond belief.
DLSS (and others) allow developers to get more (effective) power out of the h/w which isn't in any way different from getting more power from new h/w.
Games which can't run well without DLSS are similar to games which can't run well on old h/w. In both cases these games can be badly optimized and look like nothing special, upscaling isn't bringing anything new to this landscape.
What it does bring, though, is the ability to get playable framerates in games which would otherwise run at <10 fps, and I dare anyone to say that path traced CP2077 or FRT Indiana Jones "look mediocre".
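To put some rough numbers on the "more effective power" point: each upscaling quality mode renders internally at a fraction of the output resolution, so the pixel-shading savings compound quadratically. The per-axis factors below are the commonly cited DLSS ones (Quality ~0.667, Balanced 0.58, Performance 0.5, Ultra Performance ~0.333); they're used here purely as an illustration, and other upscalers like XeSS and FSR use similar ratios.

```python
# Per-axis scale factors for upscaler quality modes (commonly cited
# DLSS values, used here only for illustration).
FACTORS = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, factor):
    """Internal render resolution for a given per-axis scale factor."""
    return round(out_w * factor), round(out_h * factor)

for mode, f in FACTORS.items():
    w, h = internal_resolution(3840, 2160, f)
    saved = 1 - (w * h) / (3840 * 2160)
    print(f"{mode}: {w}x{h} internally (~{saved:.0%} fewer pixels shaded)")
```

At Performance mode a 4K output is shaded at 1920x1080, a quarter of the pixels, which is where the big frame-time wins in heavy path-traced games come from.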

Framegen is a completely different topic.
 
No, it can't, and this whole suggestion is stupid beyond belief.
DLSS (and others) allow developers to get more (effective) power out of the h/w which isn't in any way different from getting more power from new h/w.
Games which can't run well without DLSS are similar to games which can't run well on old h/w. In both cases these games can be badly optimized and look like nothing special, upscaling isn't bringing anything new to this landscape.
What it does bring, though, is the ability to get playable framerates in games which would otherwise run at <10 fps, and I dare anyone to say that path traced CP2077 or FRT Indiana Jones "look mediocre".
I don't think those games look mediocre and I've never heard anyone make that argument.

You're correct that the problems with MHW are not the fault of DLSS. It's on Capcom, who managed to make a game that looks meh and runs poorly even with modern upscaling and frame generation.
 
You're correct that the problems with MHW are not the fault of DLSS. It's on Capcom, who managed to make a game that looks meh and runs poorly even with modern upscaling and frame generation.
And this is exactly what I'm referring to: some developers using it as an excuse to skip proper optimization. I don't know why people turned this into being somehow about DLSS; this is the XeSS thread and the discussion was about scalers in general.
 
No matter what the upscaling method (and I agree generally with your DLSS callout in an XeSS thread), I think @DegustatoR's post directly addresses your complaint about developers skipping "proper optimizations": "Which isn't anything new or in any way related to upscaling specifically. So this whole train of thought is just false."

Because his statement is 100% true. "Unoptimized" games have existed for literally decades, upscaling and framegen are now just the next new boogeymen to blame.
 
No matter what the upscaling method (and I agree generally with your DLSS callout in an XeSS thread), I think @DegustatoR's post directly addresses your complaint about developers skipping "proper optimizations": "Which isn't anything new or in any way related to upscaling specifically. So this whole train of thought is just false."

Because that's 100% true. "Unoptimized" games have existed for literally decades, upscaling and framegen are now just the next new boogeymen to blame.
Still, I really don't want devs recommending we use framegen to achieve 60fps. Stuff like that is what I mean when I say they're using it wrong, or using it as a crutch. Using a crutch doesn't mean crutches are to blame for broken legs.
 
It feels like fidelity has largely stagnated: games that look no better now require upscaling just to match the performance of slightly older titles of similar fidelity at the same presentation resolution.
That's because games have gotten bigger and more complex over time. A common mistake is players comparing two titles with different scopes, thinking they are the same, and then complaining that they don't look or perform the same.

Sebbi explained this problem in a holistic way, by comparing two similar games: FF7 Remake (part 1) and FF7 Rebirth (part 2), one rendered sharply at near 4K native, the other rendered blurry by upscaling from 1080p to 4K. The thread is quite long, so I will be posting it here, while highlighting important bits in bold.

I recently finished FF7 Remake and have 50 hours of playtime in FF7 Rebirth too. It's interesting to compare these two games as they use the same character models, but the first is a PS4 game remastered for PS5 with a much more limited environment, while the sequel is a big open world game.

One of the things Digital Foundry didn't address in their video was the scale of games getting bigger. Modern games are already 100GB+ in size. It's simply not possible to bake lightmaps for a large-scale outdoor game. New techniques are required to meet visual goals.

In FF7 Rebirth (the sequel) the early game happens in towns (Nibelheim and Kalm) and has lots of indoor scenes. It's immediately apparent that the graphics are blurrier (more upscaled) and the lighting is lower quality in indoor scenes compared to FF7 Remake.

I am playing on PS5 Pro, so I haven't done a RenderDoc capture, but it's apparent that FF7 Remake relied a lot on baked lighting, while the sequel does dynamic lighting to support the vast open world space. And this is a clear trade-off in indoor scenes: perf and quality suffer.

But once you get to the open world sections of the game, the graphics look fantastic. View range is huge and there's actual content in the distance, not just some artist authored low poly backdrop. That's the big difference between corridor games and open world games.

The geometric density in FF7 Rebirth outdoor scenes is massive. There's lots of rocks, debris, and foliage. This combined with long view range and dynamic lighting is expensive for the GPU. Which is why they had to do big trade-offs to achieve these dense open world visuals.

Feels like everything is running on the limit in 60Hz mode. Upscaling is heavy in some areas. Image looks significantly fuzzier compared to FF7 Remake's sharp 4K. Apparently it's 1080p -> 4K upscaled (with PSSR). And dense geometry also requires aggressive LOD (popping).

I can't blame people for complaining about the game on base PS5. Even on PS5 Pro it feels like the developer pushed dense outdoor gfx a bit too much. In FF7 Remake the 60Hz mode was flawless, but in Rebirth it's sometimes stretched too far: LOD popping, shadow issues, blurry image.

I think one of the problems is that UE5 is geared towards these big dynamic outdoor games (such as Fortnite), and the majority of game developers are adopting these techniques even though their corridor shooter doesn't require such technology. They could still bake lighting, etc.

So, was the FF7 Rebirth tech a mistake? FF7 Remake looks a bit better and runs faster compared to Rebirth in indoor scenes. But it's important to realize that visual looks (better/worse?) and being technically demanding isn't the same thing.

FF7 Rebirth is a $200M project with 155GB of content. Free roaming open world game vs corridor-style Remake. Minimap in Remake clearly shows the area where you can move. Rebirth minimap doesn't show walkable area, since you can roam mostly freely....


Lighting baking and textures/LODs can be heavily optimized if the play area is known in advance. If you have limited play area you can calculate the minimum distance to geometry outside the area, and store their lighting/texture info at much lower quality.
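The distance bound described above can be sketched as a small hypothetical helper (all names and numbers below are mine, not from any actual engine): given the minimum possible distance from the walkable area to some out-of-reach geometry, you can compute how many mip levels of its textures/lightmaps can be dropped before a texel would become smaller than a screen pixel.

```python
import math

def droppable_mips(texel_size_m, min_distance_m, fov_deg=90, screen_w_px=3840):
    """How many mip levels can safely be dropped for geometry the player
    can never approach closer than min_distance_m (hypothetical sketch)."""
    # World-space width covered by one screen pixel at that distance.
    px_world_m = 2 * min_distance_m * math.tan(math.radians(fov_deg) / 2) / screen_w_px
    if px_world_m <= texel_size_m:
        return 0  # close enough that full-resolution texels are still needed
    return int(math.log2(px_world_m / texel_size_m))

# 1 cm texels on geometry at least 100 m outside the play area:
print(droppable_mips(0.01, 100))  # each dropped mip level quarters the storage
```

With a known play area this kind of bound can be computed offline per asset, which is exactly the optimization that free-roaming games give up.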

Also in section based games (like FF7 Remake), artists often author backgrounds separately for each section. It could be as simple as a cubemap. For example: FF7 city has two layers. Slums under a big floating plate. The area above/beyond you is just a static background image.

Imagine how hard it would be technically to allow superpowers that let you fly between the slums and the top parts of the city seamlessly. They would all need to be real geometry. This would not look any better if you just walk on the ground, but it would cost a lot more GPU time.


Freedom of movement, and fast movement options in big worlds (such as Chocobos) add a lot of technical challenges to rendering and LOD tech. If you want such features in your game, they come at the cost of significant added CPU and GPU time.

It sometimes feels like a massive waste of artist authored content to run fast with a Chocobo in FF7 Rebirth past all of these dense detailed environments. $200M of content, and I just run past it. I could stop to look at the detailed geometry, but who does that?

I feel that in older games a much bigger portion of the artist-authored content was important. Devs could not afford to produce a massive amount of content. Every piece of content had to matter. But large environments need fast travel or the player gets frustrated.

In older games fast travel between important locations was often implemented using different tech. There was a simpler world travel mode or just a map to select your destination instead of a massive seamless 3d world. Simpler tech was sufficient and saved lots of money.

So why don't the city/indoor areas in FF7 Rebirth use the old tech? FF7 Remake was a 100GB game, and Rebirth is 150GB. Rebirth has a lot of cities and locations, not just one. It simply would not be possible to store all that lighting data at the same quality.


Another potential reason is that FF7 Rebirth has lots of seamless mixed indoor/outdoor areas. There's no loading transitions. Dynamic lighting system must run most of the time, so you might as well use it for all geometry. A single universal solution is also easier for artists.

A 155GB download is already quite big. Wondering what the limit for gamers would be here? Would a 300GB game with baked lighting be preferable to a 150GB game with dynamic lighting? The game with baked lighting would run faster / have less blurry upscaling.
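To put rough numbers on that storage question (everything below is invented for illustration, not actual FF7 Rebirth data): baked lighting storage scales with lit surface area times texel density squared, which is how a game with many large locations blows past its download budget.

```python
def lightmap_gb(area_m2, texels_per_m, bytes_per_texel):
    """Storage for a single baked lighting layer (no mips, no duplicates).
    All inputs are illustrative assumptions, not real game data."""
    return area_m2 * texels_per_m ** 2 * bytes_per_texel / 1e9

# Made-up example: 10 km^2 of lit surface, ~6 cm texels, 8 bytes/texel.
print(f"~{lightmap_gb(10_000_000, 16, 8):.0f} GB for one lighting layer")
```

Even at these modest assumptions it's ~20 GB per lighting layer; multiple lighting conditions, directionality data, or finer texels multiply that quickly.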

If we look outside the FF7...IMHO the biggest change in game dev is transition from tech-first to design/art-first. Teams used to have their own tech and game/art was designed to be a perfect fit with that tech. This has changed drastically.

Nowadays game/rendering tech is so mature that it doesn't limit game ideas or artists. Everything is more design/art driven, instead of tightly guided by programmers / tech-artists. This leads to decisions that are not technically the most optimal.

Most teams nowadays use tech that wasn't built in-house just for their game. The tech isn't a 100% fit for the game, and the team doesn't have a 100% technical understanding of how to create optimal content for it. They also have fewer resources to customize the tech for their needs.

Generic engines like Unreal and Unity make the situation worse, since these technologies need to cater to the needs of thousands of different games and countless game genres. They can't be optimal for every game. Fortnite needs dynamic solutions for building; does your game?


 
Still, I really don't want devs recommending we use framegen to achieve 60fps. Stuff like that is what I mean when I say they're using it wrong, or using it as a crutch. Using a crutch doesn't mean crutches are to blame for broken legs.
Can anything available today hit 60fps at 4K native in Cyberpunk 2077 with path tracing enabled? And would you call it unoptimized? How about the same two questions for Indiana Jones? Also, I'm actually curious to see any dev anywhere telling people to turn on framegen; do you have examples?

Let's be careful when we paint word pictures with really broad brushstrokes, because nuance is still a thing.
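On the "60 fps via framegen" point specifically, a quick back-of-the-envelope sketch (assuming interpolation-style frame generation, with overheads ignored) shows why a generated 60 isn't the same as a native 60: the game still simulates and samples input at the base rate.

```python
def framegen_stats(base_fps, gen_ratio=2):
    """Presented fps vs. real frame time for interpolation-style framegen.
    Responsiveness tracks the base rate (plus interpolation overhead,
    ignored here), not the presented rate."""
    presented_fps = base_fps * gen_ratio
    base_frametime_ms = 1000 / base_fps
    return presented_fps, base_frametime_ms

presented, ft = framegen_stats(30)  # "60 fps" built from a 30 fps base
print(f"{presented} fps presented, but each real frame still takes {ft:.1f} ms")
```

That ~33 ms real frame time is why "use framegen to hit 60" feels different from a native 60 (16.7 ms per frame), even when the fps counter reads the same.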
 
I hope you realize that stuff like "AI upscaling is a free pass for devs to skip even thinking about optimizing their games" can be applied to any h/w advancement in equal measure. Which means anyone saying that should also be saying the same thing about any new GPU launch bringing performance improvements, for example.

This is so old and tired. Everyone that says this never seems to be able to provide a citation.
 
These kinds of arguments, as some have said, are very old. For example, in the early days the generational improvement in CPUs was so huge that people didn't pay much attention to optimization, focusing instead on piling on more features and meeting release schedules. There were a lot of memes about buying a faster CPU just for Microsoft (or other software companies) to slow it down. This trend only somewhat slowed when CPUs no longer saw such generational performance improvements. It's just the GPU's turn now.
 
Monster Hunter Wilds. As homerdog said, Frame Gen is listed in the system requirements; there's also a pop-up to turn it on in-game.
TBF this is hardly a requirement; the game works completely "fine" without FG — as in, you still get playable framerates, and the issues it has are there with or without FG. So FG is in no way solving this title's optimization issues, which again highlights that bad optimization isn't related to the availability of upscaling or framegen tech.
 
Monster Hunter Wilds is terribly optimized, and it bothers me that they recommend we use framegen to hit 60fps, as if that's an acceptable remedy for shitty optimization. And they obviously thought it was fine, since they put it in the system requirements as the recommended way to play the game. I don't hate framegen or upscaling or anything like that.
 