Intel XeSS upscaling

Ok, so we have one example.

Compared to quite literally decades of games that were just as unoptimized and asked for insane hardware? Did everyone else forget the past 20 years of PC gaming? Really?

It's still a boogeyman + scapegoat IMO. And besides, the cat's out of the bag now. Bluntly, what would you say we're actually gonna do about it at this point? I'm glad it's here, because for games that provide next-generation quality options, upscaling means we can enjoy those titles using all the monitor real estate we paid for.

One thing is 100% true: no matter where we are in the technology curve, there will ALWAYS BE unoptimized code. Trying to find any sort of "reason" for it that points to a new tech is ignoring literally fifty years of programming.
 
I'm just puzzled by what people think would happen if there were no upscaling. Somehow we would be getting CP2077 Overdrive graphics in native 4K at 100+ fps on a 2060?
 
Of course not. But we might be getting other, better-performing titles right now, and eventually we would reach the point where CP2077 Overdrive at native 4K is doable on the 2060 equivalent of some future time.

Upscaling should be an added benefit for those who choose to use it, not something devs count on everyone using.
 
Ok, so we have one example.

Compared to quite literally decades of games that were just as unoptimized and asked for insane hardware? Did everyone else forget the past 20 years of PC gaming? Really?

It's still a boogeyman + scapegoat IMO. And besides, the cat's out of the bag now. Bluntly, what would you say we're actually gonna do about it at this point? I'm glad it's here, because for games that provide next-generation quality options, upscaling means we can enjoy those titles using all the monitor real estate we paid for.

One thing is 100% true: no matter where we are in the technology curve, there will ALWAYS BE unoptimized code. Trying to find any sort of "reason" for it that points to a new tech is ignoring literally fifty years of programming.
I think they only remember the last 20 years. People don't remember the days pre-Xbox 360, when PC games were often inferior in weird ways to their console contemporaries. Compare Duke Nukem or Jazz Jackrabbit to NES or SNES sidescrollers and it's clear who had the better hardware. Hell, id famously cloned Super Mario 3 on PC and everyone was amazed by how smooth the scrolling was, even if it was half the framerate of the NES title. And many PS1/N64 3D games had less than flattering showings on PC. Try playing the PC ports of Alien Trilogy, Shadows of the Empire, or WipeOut. They are all lacking in some way compared to the console versions.

It wasn't until the latter half of the 360/PS3 generation that PC really started to shine as the undisputed best performance platform. And when the Xbone and PS4 launched with weaker AMD CPUs, it was easy to build a PC that outperformed the consoles in that regard. That was coupled with a real golden period for nVidia as well. You had a lot of games that would hit 100+ fps on a GTX 1050 and a somewhat modern Intel CPU while the consoles were struggling to hit 60. And I think this is one of the things that ruined people's expectations for future PC performance.

I think the idea that games leveraging new technology - stuff like ray tracing, Nanite and Lumen, or other forms of virtualized geometry and global illumination - should be running at blistering framerates and the highest resolutions on even the best hardware without upscaling is silly. Part of the reason we've had two console generations where PC was dramatically more performant is that the games were targeting console feature sets and scaling up to PC. We've started to see a trend of consoles running some features, especially RT but others as well, at lower than the lowest settings for the PC version. Also, upscaling has been more common on consoles for longer. I suspect the existence of a reasonably performant console release leads people to believe that the PC version should run even better, except that the console version is running at lower than native resolution with some settings lower than low.
 
Upscaling can be used to:
A) Make a game look better
B) Make a game run faster
C) Allow a dev to spend less time optimizing and still hit their performance target

C is what I take issue with, and that's clearly not a problem with upscaling itself. Upscaling (DLSS) is amazing; I use it in every game it's in, and get annoyed when games don't include it.
 
On consoles we have a long history of games running at sub-native resolutions, then without any anti-aliasing at all, then checkerboarding was invented and all games used it, and when upscaling was invented for PC, all console games started using it instead.

On PC, we have a long history of games with advanced graphics running at considerably less than 60 fps (sometimes even 30 fps) at max settings, whether at the max resolution or at common resolutions. People were forced to drop resolution all the time, drop anti-aliasing altogether (because it was expensive), or drop settings completely.

Regarding the max resolution part, each PC era had a different max resolution: it was 800x600, then 1024x768, then 1280x1024 (or its 720p equivalent), then 1600x1200, then 900p/1080p/1440p ...etc. We seem to have stopped at 4K for more than a decade now, but even though we have plateaued in terms of max resolution, we have increased refresh rates considerably, so the demand on hardware to keep up with display tech never stopped.

With upscaling, people don't have to choose between max resolution, max frame rates, and max settings anymore; it's now possible to have all three at the same time with very tiny sacrifices. This is a blessing, not a curse.
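
To put rough numbers on that, here's a quick sketch of how much of the output a game actually renders internally at the common upscaler presets. The per-axis scale factors are the ones typically quoted for the DLSS/FSR/XeSS Quality/Balanced/Performance/Ultra Performance modes; exact values can vary by game and SDK version, so treat this as an approximation.

    # Approximate internal render resolution for common upscaler quality presets.
    # Scale factors are the commonly quoted per-axis values; games/SDKs may differ.
    PRESETS = {
        "Quality":           1 / 1.5,   # ~66.7% per axis
        "Balanced":          1 / 1.7,   # ~58.8% per axis
        "Performance":       1 / 2.0,   # 50% per axis
        "Ultra Performance": 1 / 3.0,   # ~33.3% per axis
    }

    def internal_resolution(out_w, out_h, scale):
        return round(out_w * scale), round(out_h * scale)

    out_w, out_h = 3840, 2160  # 4K output
    for name, scale in PRESETS.items():
        w, h = internal_resolution(out_w, out_h, scale)
        fraction = (w * h) / (out_w * out_h)
        print(f"{name:18s} -> {w}x{h} (~{fraction:.0%} of the output pixels)")

Since shading cost scales roughly with pixel count, even the Quality preset cuts the per-frame pixel work by more than half at 4K, which is exactly why "max resolution, max settings, max frame rate" becomes attainable all at once.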
 
Upscaling can be used to:
A) Make a game look better
B) Make a game run faster
C) Allow a dev to spend less time optimizing and still hit their performance target

C is what I take issue with, and that's clearly not a problem with upscaling itself. Upscaling (DLSS) is amazing; I use it in every game it's in, and get annoyed when games don't include it.

Implied here is a baseline level of optimization that would be considered acceptable before upscaling is applied. Who decides what’s acceptable?
 
And now we're back to the problems with the NVIDIA App deciding what "optimized" settings should be in your games.

I'll never assume my personal definition of optimal matches homerdog's definition of optimal. This isn't related to my opinion being any better than homerdog's, rather we're both unique people with our own biases and preferences and expectations.

IF an upscaler is required to hit a 60fps target, does that mean the game is unoptimized?
IF an upscaler is required to hit a 120fps target, does that mean the game is unoptimized?
IF an upscaler is required to hit a 240fps target... You get the idea.

Every single game offering an upscaler option is doing it for better performance; even the "look better" category is really about letting a higher set of graphical options run faster than they otherwise would. How do we decide which games are unoptimized versus simply too taxing for your chosen hardware? This sounds like the Crysis nonsense all over again, really.
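
Just to illustrate why the target matters so much: the per-frame budget shrinks fast as the fps target rises, so the exact same renderer can hit one target natively and "need" an upscaler for the next. A tiny sketch with a made-up native frame time (not from any specific game):

    # Frame-time budget per fps target vs a hypothetical native GPU frame time.
    def frame_budget_ms(target_fps):
        return 1000.0 / target_fps

    native_frame_ms = 14.0  # made-up frame time at native res, max settings

    for target in (60, 120, 240):
        budget = frame_budget_ms(target)
        verdict = "fits natively" if native_frame_ms <= budget else "needs upscaling or lower settings"
        print(f"{target:3d} fps target -> {budget:5.2f} ms budget: {verdict}")

By that table the hypothetical game is "optimized" at 60 fps and "unoptimized" at 120 and 240, which is the point: the label depends entirely on the target you pick.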
 
Implied here is a baseline level of optimization that would be considered acceptable before upscaling is applied. Who decides what’s acceptable?
Obviously me :mrgreen:

But for real, there is general consensus that Monster Hunter Wilds does not look good enough to justify the performance it delivers. All the technical analysis I've seen comes to this conclusion.

Given the game's astronomical sales, it seems most people don't care enough for this to affect their purchasing decisions. So maybe releasing this shite was actually the right decision from Capcom.

While Crysis also ran poorly at the highest settings, it looked generations ahead of any other game. So I never felt its performance demands were unjustified.

Anyway, I don't know that I disagree with any of your or @Albuquerque's points. I'm not entirely sure what we're arguing about :)
 
Implied here is a baseline level of optimization that would be considered acceptable before upscaling is applied. Who decides what’s acceptable?
IMHO "baseline" should be averageish computers (say, Core i5/Ryzen 5 from few years back, 3060-4060 level based on Steam) at averageish resolution (1080p-1440p based on Steam) at high settings (step down or two from max, path tracing can be kept out from this completely) around 60 FPS. Obviously without any frame gens, upscalers and whatnot. And that means with todays visuals (again, not including path tracing nor even heavy RT), not something you might mistake for last gen consoles at times like Monster Hunter Wilds
Obviously there are probably about as many "acceptable baselines" as there are people, but that's my view.
 
So, isn't this a situation where people vote with their wallet? Doesn't this just devolve into the same "common sense" statements we always make about upcoming / newly released games?

Don't preorder.
Watch the reviews.
Watch for more than one review.
And then don't buy shoddy games.

Sounds like about a hojillion people went ahead and bought Monster Hunter Wilds, and continue to do so. As it turns out, apparently it's not so poorly optimized as to cause folks to stop buying it.
 
It's not just that, but with more than 200K concurrent players one month after its release, many people are still playing it.
It's a good game gameplay-wise. As long as it's not bad to the point of being unplayable, people are willing to sacrifice beauty for gameplay. I don't think that's an unreasonable take.
But on the other hand, I know a friend who was interested in buying this game (he played both open betas) but decided to delay his purchase due to its poor optimization (and the fact that he still can't get a 5080). So it's not entirely without negative effects.
 
IMHO "baseline" should be averageish computers (say, Core i5/Ryzen 5 from few years back, 3060-4060 level based on Steam) at averageish resolution (1080p-1440p based on Steam) at high settings (step down or two from max, path tracing can be kept out from this completely) around 60 FPS. Obviously without any frame gens, upscalers and whatnot. And that means with todays visuals (again, not including path tracing nor even heavy RT), not something you might mistake for last gen consoles at times like Monster Hunter Wilds
Obviously there are probably about as many "acceptable baselines" as there are people, but that's my view.

Yep there’s a ton of subjectivity involved. Stuff like “high settings” is arbitrary, as it could mean very different things depending on the game. Also, many people give extra points for art style even if it has no impact on the workload. The lack of optimization has to be really bad and obvious, like MHW, for everyone to come to the same conclusion.

I’m not even sure how we would characterize today’s visuals since not all games share the same featureset or scope. I’m guessing everyone is talking about the big budget games with great production values that make the benchmark circuit. Those are typically the games that would need upscaling. If you’re not playing those games then your 3060 is likely still crushing it and upscaling isn’t necessary.

A 3060 averages 67 fps @ 1080p in TPU’s latest benchmark suite with the following settings.

  • All games are set to their highest quality setting unless indicated otherwise.
  • All games are running at their native resolution without upscaling (no DLSS or FSR).
  • All games have ray tracing disabled.

So on average I would say those games are pretty well optimized.
 
They do use the highest-end CPU though, and averages can be misleading too.
Out of the 25 games they test, the 3060 fails to reach a 60 FPS average at 1080p in 11 of them (taken from the B570 review, which seems to be the latest to include 3060 results; the 12 GB model, to be specific).
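
A quick illustration of how the two can coexist (made-up per-game numbers, not TPU's actual data, just shaped to mirror the counts above):

    # A suite can average well above 60 fps while many individual titles miss 60.
    fps = [105, 100, 98, 95, 92, 88, 85, 80, 78, 75, 72, 68, 64, 62,
           58, 55, 54, 52, 50, 48, 45, 44, 42, 40, 38]  # 25 hypothetical games

    average = sum(fps) / len(fps)
    below_60 = sum(1 for f in fps if f < 60)
    print(f"average: {average:.1f} fps, games below 60 fps: {below_60} of {len(fps)}")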
 
Yeah, that’s how averages work :) Remember this is at max settings, so it’s not representative of the sensible settings most people would actually use.
 