Should full scene RT be deprioritised until RT solutions are faster? *spawn

In 2015, 1080p60 was max settings for me. I don't know about you, but for me and most other PC gamers, 1080p60 was max. I was able to run it at max settings at 1080p if I wanted. Today, I have a 4K 240 Hz monitor, and a 5090 can't even get me 1/8th of the way to my monitor's refresh rate at max settings in Cyberpunk.

1080p/60 was absolutely not the equivalent of 4K/120 in 2015. 4K was already a thing back then, and 1440p was extremely popular among enthusiasts. For reference, I bought a 165 Hz 1440p monitor in April 2016.

Also, to add some perspective: 1440p is 78% more pixels than 1080p, 4K is 125% more pixels than 1440p, and 300% more pixels than 1080p. The idea that we should be gaming at 4K/120 Hz with cutting-edge graphics is kinda nuts, to be honest.
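A minimal sketch verifying those percentages, assuming the standard 1920x1080, 2560x1440 and 3840x2160 pixel grids:

```python
# Quick sanity check of the pixel-count comparisons above.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

def pct_more(a, b):
    """How many percent more pixels resolution a has than resolution b."""
    return (pixels[a] / pixels[b] - 1) * 100

print(f"1440p vs 1080p: {pct_more('1440p', '1080p'):.0f}% more pixels")  # ~78%
print(f"4K vs 1440p:    {pct_more('4K', '1440p'):.0f}% more pixels")     # 125%
print(f"4K vs 1080p:    {pct_more('4K', '1080p'):.0f}% more pixels")     # 300%
```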
 
In 2015, 1080p60 was max settings for me.
In 2015, I had a 1440p monitor, and I couldn't get max settings...
Perhaps what was max settings for you wasn't max settings.
I still have a 1440p monitor (now ultrawide, 240 Hz), and max settings are still not max settings for me.
That's because I chose to tip the scales towards performance rather than towards the pixel density I opted for a decade ago.

We're in an era of unremarkable GPU gains, so resource efficiency should be prioritized. To me this means using techniques that maximize the resources available on the hardware, which is the opposite of what's going on
Or using new technology (in both software and hardware) that has room to grow.

Unfortunately, developers behave like we're back in the '90s, when we expected rapid GPU advancement every generation.
I don't think that's the case.
I think most understand that we need other ways to improve real-time rendering.

If developers or Nvidia think I'm spending 5090 money to be reliant on upscaling...
Well, in the era of unremarkable GPU gains, you have to rely on something.
If there is an alternative, someone will certainly try to capitalize on it.
In the meantime, perhaps a 5090 is not the product for you.

If we go back a few years (before RT and DLSS), do you think that "max settings" meant something?
The visual difference between "high" and "ultra" didn't justify the performance impact.
I'm using quotes because, I hope we can both agree, "ultra" means nothing: there could be an arbitrary number of presets above "ultra" that are either forward-looking or simply more precise, and that drop performance to single digits.

I am trying to understand:
Are you suggesting that presets should be scaled down, or that the overall rendering target should be adjusted?
Are you proposing that real-time graphics should remain stagnant across the industry for the foreseeable future?
 
In 2015, 1080p60 was max settings for me. I don't know about you, but for me and most other PC gamers, 1080p60 was max. I was able to run it at max settings at 1080p if I wanted. Today, I have a 4K 240 Hz monitor, and a 5090 can't even get me 1/8th of the way to my monitor's refresh rate at max settings in Cyberpunk.

There's a point where you have to read the tea leaves and be wise. We're in an era of unremarkable GPU gains, so resource efficiency should be prioritized. To me this means using techniques that maximize the resources available on the hardware, which is the opposite of what's going on. Unfortunately, developers behave like we're back in the '90s, when we expected rapid GPU advancement every generation. If developers or Nvidia think I'm spending 5090 money to be reliant on upscaling, then they should perhaps get their heads examined. For the asking price, you can buy a used car, buy a meaningful amount of stocks, travel internationally, etc. There are times of exponential gains/breakthroughs and times of consolidation. We're in a time of consolidation, and it's about time we all got with the program.
Your setup was your setup; other people (like me, for example) ran a higher setup at the time 🤷‍♂️
(And benchmarks show this, as already posted by me in this very thread)
Unless your claim is that 1080p was the maximum you could buy at the time, this is objectively a fallacious claim :yep2:

And "most other PC gamers" did not run a GeForce 6800 Ultra: https://www.anandtech.com/show/1293/11
Also "most other PC gamers" did not run a GeForce GTX TITAN X https://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review
Remember, you are talking about a RTX 5090, a GPU that will cater to ~1% af the market, a flagship GPU


We are back to "But can it run Crysis?", and thus the circle is complete :runaway:
 
Who remembers?

[attached image]
I ran this on Amiga:
[attached screenshot]
At 320x200 @ 7-25 FPS :oops:

I could not go back to this today, even if it was "bleeding edge" in its day.
 
I have a 4K 240 Hz monitor, and a 5090 can't even get me 1/8th of the way to my monitor's refresh rate at max settings in Cyberpunk.
This is not the job of the 5090. You just said 1080p60 was max settings for you in 2015; if graphics had stayed the same, you could argue that a 5090 should do 2160p240 in 2025. But graphics didn't stay the same: raster graphics has advanced to the point where a 4090 can't do 2160p60 in games such as Immortals of Aveum, RoboCop, ARK: Survival Ascended, Fort Solis, LEGO Horizon Adventures, etc.

In 2025, the 5090 will give you 2160p90 in Cyberpunk Path Tracing using DLSS Performance with Ray Reconstruction, which is actually better than native; that is something you couldn't do with heavy games in 2015. If you want to max out your 240 Hz, then Frame Generation is available to you as well.

Besides, a 4K 240 Hz monitor in 2025 is pretty high end; you should compare that to the 1440p 120 Hz monitors that were high end back then.
 
1080p/60 was absolutely not the equivalent of 4K/120 in 2015. 4K was already a thing back then, and 1440p was extremely popular among enthusiasts. For reference, I bought a 165 Hz 1440p monitor in April 2016.
Yeah, I don't know about that one, chief... Firstly, 1440p represented 1.28% of all monitors in 2015, based on the latest data I can find. That's less than the 4K share (4.08%) I see today, and I would not even dare to call 4K "popular" among enthusiasts. Unless, of course, your definition of enthusiast means those who spend a lot of money, which funnily enough is not the actual definition of an enthusiast.

Source 1: Tech Spot
Source 2: WccfTech
Also, to add some perspective: 1440p is 78% more pixels than 1080p, 4K is 125% more pixels than 1440p, and 300% more pixels than 1080p. The idea that we should be gaming at 4K/120 Hz with cutting-edge graphics is kinda nuts, to be honest.
Not really, but sure?
This is not the job of the 5090. You just said 1080p60 was max settings for you in 2015; if graphics had stayed the same, you could argue that a 5090 should do 2160p240 in 2025. But graphics didn't stay the same: raster graphics has advanced to the point where a 4090 can't do 2160p60 in games such as Immortals of Aveum, RoboCop, ARK: Survival Ascended, Fort Solis, LEGO Horizon Adventures, etc.
Why would I look at it like that? It's a comparison to see how the best GPUs of the time ran the games of that time period. Yes, graphics change, but so does technology in general. The benchmark for a 5090 is not the games of yesteryear but the games of today.
In 2025, the 5090 will give you 2160p90 in Cyberpunk Path Tracing using DLSS Performance with Ray Reconstruction, which is actually better than native; that is something you couldn't do with heavy games in 2015. If you want to max out your 240 Hz, then Frame Generation is available to you as well.
Frame generation is not performance, and DLSS Performance is not 4K; it renders at 1080p. Please do not attempt to equate unlike things and make flawed comparisons. As for the better-than-native comparison, it's pretty well known where I stand on that: I disagree with any better-than-native claims.
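For reference, a minimal sketch of where that 1080p figure comes from, assuming the commonly cited DLSS per-axis scale factors (Quality ≈ 0.667, Balanced ≈ 0.58, Performance = 0.5, Ultra Performance ≈ 0.333); games can override these, so treat the exact values as an assumption:

```python
# Internal render resolution for a given DLSS mode at a given output resolution.
# The per-axis scale factors below are the commonly cited defaults; individual
# games can override them, so treat the exact values as an assumption.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080) -> the "1080p" above
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
```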
 
I think the claim about Doom 3 is regarding its highest texture setting only running on GPUs with more than 256 MB of VRAM, which in August 2004 did not exist.
That's fair; however, based on your link, we only had to wait 6 months to get a GPU that could run the highest texture setting? We're 2 GPU generations (~4+ years) out from the release of Cyberpunk 2077, and there is no GPU that can run it at max settings at native 4K... At the rate things are progressing, we may need to wait 2-4 more GPU generations for that to exist. This would represent the slowest GPU hardware progression in history. Meanwhile, we could already run Crysis at max settings at 1080p on the HD 5970 a mere 3 years after release. Tbf, I can't recall if Very High was the highest setting.

Source: Crysis Benchmarks

EDIT: Looking at TechPowerUp, it looks like there was an extreme setting, and the GTX 590 ran it fine.
TechPowerUp
 
Yeah, I don't know about that one, chief... Firstly, 1440p represented 1.28% of all monitors in 2015, based on the latest data I can find. That's less than the 4K share (4.08%) I see today, and I would not even dare to call 4K "popular" among enthusiasts. Unless, of course, your definition of enthusiast means those who spend a lot of money, which funnily enough is not the actual definition of an enthusiast.

Source 1: Tech Spot
Source 2: WccfTech

Uh, like the 1% buying flagship GPUs; you are just digging your hole even bigger now 🤷‍♂️

Frame generation is not performance, and DLSS Performance is not 4K; it renders at 1080p. Please do not attempt to equate unlike things and make flawed comparisons. As for the better-than-native comparison, it's pretty well known where I stand on that: I disagree with any better-than-native claims.
He was talking about DLSS TNN, not FG, and your opinion is just that 🤷‍♂️
People I trust to OBJECTIVELY examine things have a different perspective:

You seem to insist on posting in bad faith :nope:
 
That's fair; however, based on your link, we only had to wait 6 months to get a GPU that could run the highest texture setting? We're 2 GPU generations (~4+ years) out from the release of Cyberpunk 2077, and there is no GPU that can run it at max settings at native 4K... At the rate things are progressing, we may need to wait 2-4 more GPU generations for that to exist. This would represent the slowest GPU hardware progression in history. Meanwhile, we could already run Crysis at max settings at 1080p on the HD 5970 a mere 3 years after release.

Source: Crysis Benchmarks
So is it 6 months later at release, or is it 6 months later, you know, after release?
(you are shifting the goalposts a lot and running away from your own arguments when confronted with facts 🤔)

And why are you so angry over a GPU you will not buy and technology you can disable in most games (so far)?
Did Doom3 make you angry too?
Did Crysis make you angry too?
Did the Witcher 3 make you angry too?
Did Cyberpunk 2077 make you angry too?
 
Uh, like the 1% buying flagship GPUs; you are just digging your hole even bigger now 🤷‍♂️
Frankly, I don't know what argument you're trying to put forward...
He was talking about DLSS TNN, not FG, and your opinion is just that 🤷‍♂️
People I trust to OBJECTIVELY examine things have a different perspective:
Are you good? All caps and bold?

Secondly, read up on the appeal-to-authority fallacy; I even included a helpful link for you here. Try not to structure your arguments in this form; it's not particularly conducive to good discussion. Finally, I don't need DF to tell me what to think when I can test things myself and draw my own conclusions. While there are things the new TNN model does better, there are things it does worse.
So is it 6 months later at release, or is it 6 months later, you know, after release?
(you are shifting the goalposts a lot and running away from your own arguments when confronted with facts 🤔)
Except your claim was dead in the water until Dictator came to save you with the max texture setting, a setting that really isn't bottlenecked by the GPU's ability to render, only by its VRAM.
And why are you so angry over a GPU you will not buy and technology you can disable in most games (so far)?
I am not...?
Did Doom3 make you angry too?
Did Crysis make you angry too?
Did the Witcher 3 make you angry too?
Did Cyberpunk 2077 make you angry too?
Did you see me posting in all caps and bold? I think you may just be projecting here, but that's just me.
 
I posted this:
/yawn

When Doom3 released, no GPU could run it at max settings.
When Crysis released, no GPU could run it at max settings.
When the Witcher 3 released, no GPU could run it at max settings.

Plenty of games have done this; the only difference is that back then people adjusted their settings down, but today people whine and want to stop progress because their expectations of their GPU are broken.

Boring :sleep:

You replied that 66% of my post was a lie:
It appears that 66% of your claims stray far from the truth, but then again, I can't say I'm surprised.

Care to correct your claim?
Or will you prove me correct and keep ignoring facts and shifting goalposts?
 
Care to correct your claim?
Or will you prove me correct and keep ignoring facts and shifting goalposts?
No, I don't think I will. You chose a resolution for The Witcher 3 that 99.3% of PC gamers didn't use and that in no way would constitute even the average enthusiast's definition of max settings, according to the Steam Hardware Survey. You employed the same tactic for Doom 3. There was nothing incorrect about my initial statements.
 
At the rate things are progressing, we may need to wait 2-4 more GPU generations for that to exist. This would represent the slowest GPU hardware progression in history.
OK, assuming that is correct, what do you propose the industry should do to course correct?
What is your alternative?
 
As for the better-than-native comparison, it's pretty well known where I stand on that: I disagree with any better-than-native claims.
Ray Reconstruction in any capacity is absolutely better than native when doing path tracing. Native runs with slow and laggy denoisers; they are blurry and smeary, with lots of lost detail and missing effects. Ray Reconstruction avoids 90% of these problems.
Firstly, 1440p represented 1.28% of all monitors in 2015
And 4K 240 Hz monitors are less than half of that percentage in 2025.

Why would I look at it like that?
You are the one who thinks he is entitled to play games at max refresh rate, max native resolution, and max settings on a high-end GPU. This has never happened in the past.
You chose a resolution for The Witcher 3 that 99.3% of PC gamers didn't use and that in no way would constitute even the average enthusiast's definition of max settings
Just like you did, when you chose 4K 240 Hz as your performance baseline.
 
In 2015, 1080p60 was max settings for me. I don't know about you, but for me and most other PC gamers, 1080p60 was max. I was able to run it at max settings at 1080p if I wanted. Today, I have a 4K 240 Hz monitor, and a 5090 can't even get me 1/8th of the way to my monitor's refresh rate at max settings in Cyberpunk.

There's a point where you have to read the tea leaves and be wise. We're in an era of unremarkable GPU gains, so resource efficiency should be prioritized. To me this means using techniques that maximize the resources available on the hardware, which is the opposite of what's going on. Unfortunately, developers behave like we're back in the '90s, when we expected rapid GPU advancement every generation. If developers or Nvidia think I'm spending 5090 money to be reliant on upscaling, then they should perhaps get their heads examined. For the asking price, you can buy a used car, buy a meaningful amount of stocks, travel internationally, etc. There are times of exponential gains/breakthroughs and times of consolidation. We're in a time of consolidation, and it's about time we all got with the program.
I don't understand the point you are making. You're not actually suggesting that games should be designed to run at native 4K 240 Hz (16x the pixels per second of 1080p60), right?
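For reference, a minimal check of that 16x figure: 4K is 4x the pixels of 1080p, and 240 Hz is 4x 60 Hz, so the pixel throughput target is 16x higher.

```python
# Pixel throughput (pixels per second) of 4K @ 240 Hz vs 1080p @ 60 Hz.
pps_4k_240   = 3840 * 2160 * 240   # ~1.99 billion pixels/s
pps_1080p_60 = 1920 * 1080 * 60    # ~124 million pixels/s
print(pps_4k_240 / pps_1080p_60)   # 16.0
```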
 
Yeah, I don't know about that one, chief... Firstly, 1440p represented 1.28% of all monitors in 2015, based on the latest data I can find.

Huh? You're the one talking about 4K/120 Hz here. What percentage is that at?
 
Ray Reconstruction in any capacity is absolutely better than native when doing path tracing. Native runs with slow and laggy denoisers; they are blurry and smeary, with lots of lost detail and missing effects. Ray Reconstruction avoids 90% of these problems.
Not sure why you're referring to Ray Reconstruction when I was talking about DLSS upscaling.
And 4K 240 Hz monitors are less than half of that percentage in 2025.


You are the one who thinks he is entitled to play games at max refresh rate, max native resolution, and max settings on a high-end GPU. This has never happened in the past.

Just like you did, when you chose 4K 240 Hz as your performance baseline.
I really didn't choose 4K 240 Hz as a baseline. It's just the monitor I own. In fact, if you can quote me where I said that "4K 240 Hz is the baseline", I'd appreciate it. As for playing games at max settings and refresh rate, your statement can be true or false depending on the GPU/CPU/monitor combination one has... For example, if you only have a 1080p 60 Hz monitor but for some reason have a 5090/9800X3D combo, then you can play every game at max settings and max fps.

As to feeling entitled to achieve some level of satisfactory performance after spending nearly $3k USD (and realistically $4.5k in my currency), you'd better believe I'll have a high level of warranted entitlement. If you can spend that much money and be satisfied with 28 fps on average in Cyberpunk at native 4K almost 5 years after the game released, then more power to you. It just means we value money differently, and there's nothing wrong with that.
 