In 2015, 1080p60 was max settings for me.
In 2015, I had a 1440p monitor, and I couldn't get max settings...
Perhaps what was max settings for you wasn't max settings.
I still have a 1440p monitor (now ultrawide, 240 Hz), and max settings are not max settings for me.
That's because I chose to tip the scales towards performance, as opposed to the PPI I opted for a decade ago.
We're in an era of unremarkable GPU gains, so resource efficiency should be prioritized. To me this means using techniques that make the most of the resources available on the hardware, which is the opposite of what's going on.
Or using new technology (both software and hardware) that still has room to grow.
Unfortunately, developers behave like we're back in the '90s, when we expected rapid GPU advancement every generation.
I don't think that's the case.
I think that most understand that we need other ways to improve real-time rendering.
If developers or Nvidia think I'm spending 5090 money to be reliant on upscaling...
Well, in the era of unremarkable GPU gains, you have to rely on something.
If there is an alternative, someone will certainly try to capitalize on it.
In the meantime, perhaps a 5090 is not the product for you.
If we go back a few years (before RT and DLSS), do you think that "max settings" meant something?
The visual difference between "high" and "ultra" didn't justify the performance impact.
I'm using quotes because, I hope we can both agree, "ultra" means nothing: there could be an arbitrary number of presets above "ultra" that are either forward-looking or simply more precise, and that drop performance to single digits.
I am trying to understand: are you suggesting that presets should be scaled down, or that the overall rendering target should be adjusted?
Are you proposing that real-time graphics should remain stagnant across the industry for the foreseeable future?