Should full scene RT be deprioritised until RT solutions are faster? *spawn

The latest example I can recall of a modern game going to such extremes was the PC version of Tomb Raider: Legend, where the graphics features ranged all the way from PS2-like feature sets to full DX9 PS360-style rendering, all with appropriately re-touched models, textures, lightmaps, and SFX, hand-crafted to get the best result out of each rendering paradigm.
I remember Half-Life 2: during development, Valve had to create many rendering systems spanning three different DirectX versions: DX7, DX8 and DX9. The differences between these three APIs were vast, each requiring different data storage, shading languages, and rendering approaches. In the end Valve had to create 9 rendering systems and spent a very long time making sure all of them looked consistent.
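To make the cost concrete, here is a minimal, hypothetical sketch (not Valve's or anyone's actual engine code) of the kind of backend abstraction such an engine needs. Every material and effect has to be authored and validated against each backend, which is where the combinatorial workload and the "keep them all looking consistent" effort come from.

```cpp
// Hypothetical illustration only -- not Source engine code.
// Each supported API generation gets its own backend behind a common interface,
// and every material/effect must be authored and checked against all of them.
#include <iostream>
#include <memory>
#include <string>
#include <vector>

struct SceneView {};  // placeholder for per-frame camera/visibility data

class IRenderBackend {
public:
    virtual ~IRenderBackend() = default;
    virtual std::string name() const = 0;
    virtual void loadMaterial(const std::string& material) = 0;  // per-backend shader or fixed-function setup
    virtual void drawFrame(const SceneView& view) = 0;
};

class Dx7Backend : public IRenderBackend {  // fixed-function path
public:
    std::string name() const override { return "DX7 fixed-function"; }
    void loadMaterial(const std::string& m) override { std::cout << "DX7 multitexture setup for " << m << "\n"; }
    void drawFrame(const SceneView&) override { /* fixed-function draw calls */ }
};

class Dx8Backend : public IRenderBackend {  // early shader models
public:
    std::string name() const override { return "DX8 ps1.x"; }
    void loadMaterial(const std::string& m) override { std::cout << "DX8 ps1.x shaders for " << m << "\n"; }
    void drawFrame(const SceneView&) override { /* shader model 1.x draw calls */ }
};

class Dx9Backend : public IRenderBackend {  // full SM2.0 path
public:
    std::string name() const override { return "DX9 ps2.0"; }
    void loadMaterial(const std::string& m) override { std::cout << "DX9 ps2.0 shaders for " << m << "\n"; }
    void drawFrame(const SceneView&) override { /* shader model 2.0 draw calls */ }
};

int main() {
    std::vector<std::unique_ptr<IRenderBackend>> backends;
    backends.emplace_back(std::make_unique<Dx7Backend>());
    backends.emplace_back(std::make_unique<Dx8Backend>());
    backends.emplace_back(std::make_unique<Dx9Backend>());

    // Every material the artists author has to look right on every backend,
    // so the authoring/testing effort scales with (materials x backends).
    const std::vector<std::string> materials = {"water", "character_skin", "world_geometry"};
    for (const auto& backend : backends)
        for (const auto& material : materials)
            backend->loadMaterial(material);
}
```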


My goodness it looks so much better. Is this modded?
No, just default raster lighting vs path tracing.
Is that PT off or RT off? The lighting in the off pic looks very flat for per pixel RTGI.
Yes PT and RT off vs PT on.
 
That's on the developer. Trust me, I'd rather games not stutter than have every pixel path traced up to 25 bounces... I think we have bigger problems, but we don't get to decide how things run and how things look. We only get to buy things or not buy them. The industry isn't going to stay stuck in the past because some people don't like a bit of ghosting, or whatever.
Well, clearly the users are staying stuck in the past... If we believe Steam, users are playing old games more than ever, and we're almost 5 years into a new generation. Then again, it's not about staying in the past. As someone who bought the PS5, Series X, PS5 Pro, an RTX 3080, then a 4080, this generation on average has yielded one of the most unremarkable jumps in graphics I've ever experienced. For the most part, RT has done absolutely nothing to address that. Personally speaking, I'm no longer interested in funding experiments that don't yield tangible gains.
I can think of a lot of games which look better with RT. Most games are pretty good about allowing you to turn stuff off. If you're looking for fun to play games, I suggest not worrying about graphics at all. Cyberpunk is a bad example because you're not being forced to play with RT or PT.
Cyberpunk was referenced because, for many, it represented a notable jump in graphics. For me, it's a game that is carried mostly by its art and city design while having no readily comparable reference point. There's no other game in that type of setting at that scale. However, in terms of rendering, barring RT/PT, it's quite unremarkable.
This is the cost of advancing technology... it always has been. Devs/pubs are incentivized to utilize new technologies. It's what keeps the industry going, not just from a technological perspective; it keeps the developers interested as well. People need reasons to buy new GPUs and CPUs. Games built on old technology start running too fast on all current GPUs, so some new graphics technology has to come out which improves things but drops the old GPUs to their knees, requiring people to purchase new ones to build back up again. Having GPUs capable of this level of ray tracing massively improves productivity; being able to do it in real time for games is incredible. Yes, the average person may not be able to see the difference between baked GI and RTGI in a game, but we can always cherry-pick scenarios which work one way or the other. At the end of the day, the dev makes that call.

Don't get me wrong though, I understand your perspective. I just think it's always been the case that something advances which improves things in some ways and regresses things in others, but eventually it all catches up for the betterment of the medium.
Like I said, I'm not against paying the cost of advancement... However, if all you have to show for it is slightly better shadows, RT reflections, and real-time GI as opposed to baked GI, I'm not interested.
 
I don't see why we should blame RT for poor implementations of tried and tested methods like SSR. If they're tried and tested it should be easy enough to ship a good implementation. I think you're putting the blame in the wrong place.
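Since SSR came up: below is a deliberately simplified, hypothetical sketch of the core screen-space reflection march (CPU-style C++, not any particular engine's code). The "tried and tested" part is the loop itself; the quality of a shipped implementation lives in the details this sketch glosses over — step size, the thickness test, fading near screen edges, and what you fall back to when the ray misses or leaves the screen.

```cpp
// Hypothetical sketch of the core screen-space reflection (SSR) march --
// not a shipping implementation. Step size, thickness, edge handling and
// miss fallbacks are exactly where real implementations differ in quality.
#include <cstdio>
#include <vector>

struct Vec2 { float x, y; };

struct DepthBuffer {
    int width, height;
    std::vector<float> depth;  // linear view-space depth per pixel
    float at(int x, int y) const { return depth[y * width + x]; }
};

// March a ray through screen space; return the hit pixel or (-1,-1) on a miss.
// 'dir' is the per-step screen-space increment; 'rayDepthStep' is how much the
// ray's own depth changes per step.
Vec2 traceScreenSpaceRay(const DepthBuffer& db, Vec2 origin, Vec2 dir,
                         float startDepth, float rayDepthStep,
                         int maxSteps = 64, float thickness = 0.5f) {
    Vec2 p = origin;
    float rayDepth = startDepth;
    for (int i = 0; i < maxSteps; ++i) {
        p.x += dir.x; p.y += dir.y;
        rayDepth += rayDepthStep;
        int px = (int)p.x, py = (int)p.y;
        if (px < 0 || py < 0 || px >= db.width || py >= db.height)
            return {-1.0f, -1.0f};                 // ray left the screen: no data to reflect
        float sceneDepth = db.at(px, py);
        if (rayDepth > sceneDepth && rayDepth < sceneDepth + thickness)
            return p;                              // ray passed just behind a surface: count it as a hit
    }
    return {-1.0f, -1.0f};                          // no intersection found within the step budget
}

int main() {
    // Tiny synthetic depth buffer: a "wall" at depth 5 on the right half of a 16x1 screen.
    DepthBuffer db{16, 1, std::vector<float>(16, 100.0f)};
    for (int x = 8; x < 16; ++x) db.depth[x] = 5.0f;

    Vec2 hit = traceScreenSpaceRay(db, {0.0f, 0.0f}, {1.0f, 0.0f},
                                   /*startDepth=*/4.6f, /*rayDepthStep=*/0.05f);
    std::printf("hit pixel: (%.0f, %.0f)\n", hit.x, hit.y);  // expected: (9, 0)
}
```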
I'm not blaming poor RT implementations. It's a case of trying to do everything and doing nothing well.
Yep and it also takes time for developers to wrap their heads around new stuff. Unfortunately they don't have the luxury of doing that in some offline incubator while shipping games based on mature tech. They're going to try and fail and learn and improve over time. As an end user I think the ROI on graphics improvements in the past 5-10 years has been really poor. Games in 2025 don't look "that" much better than games from 2015 yet perform far worse even without RT. But I also understand that we're in a period of diminishing returns with current graphics APIs.
That's a fair take for sure, and I'm OK with devs taking their time to learn. I'm just not going to personally fund their learning anymore. I'll wait till the ROI is good again to start spending like I used to.
 
Btw, I find the title of the thread misleading. It's not about full scene RT being abandoned. Eventually we want to reach a point where the hardware is good enough that we can achieve full scene RT, or even better, path tracing, without relying on old methods.
 
/yawn

When Doom 3 released, no GPU could run it at max settings.
When Crysis released, no GPU could run it at max settings.
When The Witcher 3 released, no GPU could run it at max settings.

Plenty of games have done this; the only difference is that back then people adjusted their settings down, whereas today people whine and want to stop progress because their expectations of their GPUs have been broken.

Boring :sleep:
 
Btw, I find the title of the thread misleading. It's not about full scene RT being abandoned. Eventually we want to reach a point where the hardware is good enough that we can achieve full scene RT, or even better, path tracing, without relying on old methods.
Present an alternative and I'll change it! (One of the perils of people posting OT in other threads instead of starting new ones: they are at the mercy of the mods' interpretation when the title is formed.)
 
Present an alternative and I'll change it! (One of the perils of people posting OT in other threads instead of starting new ones: they are at the mercy of the mods' interpretation when the title is formed.)
I don't know. RT implemented on a case-by-case basis until the hardware is ready?
 
/yawn

When Doom 3 released, no GPU could run it at max settings.
When Crysis released, no GPU could run it at max settings.
When The Witcher 3 released, no GPU could run it at max settings.

Plenty of games have done this; the only difference is that back then people adjusted their settings down, whereas today people whine and want to stop progress because their expectations of their GPUs have been broken.

Boring :sleep:
You are totally off the mark. At least in the case of Doom 3 and Crysis, these games set new standards in rendering techniques but were not the norm. The majority of games respected the hardware that was available.

Crysis, for example, was a game that sold itself on the high end and targeted hardware of the future. A lot of people couldn't run it, but that wasn't what it was selling itself on. This affected sales, and thus Crytek scaled back on the sequels.

Devs weren't aiming as high as Crysis as a rule, and yet the technology evolved until the hardware was ready.
 
/yawn

When Doom 3 released, no GPU could run it at max settings.
When Crysis released, no GPU could run it at max settings.
When The Witcher 3 released, no GPU could run it at max settings.

Plenty of games have done this; the only difference is that back then people adjusted their settings down, whereas today people whine and want to stop progress because their expectations of their GPUs have been broken.

Boring :sleep:
I think this would qualify as a low effort post, no? On top of being a low effort post, it's just a false equivalence. Show me a time in the past when Nvidia released an x80 product with worse performance per dollar than the previous x80 product and a gen-on-gen performance increase of only 15%. I'll wait. Let's not forget that the rapid rate of GPU improvement in those days renders this statement completely useless. In 2024, a $3000 5090 gets you a whole 28 frames in Cyberpunk 2077 at 4K with PT.

While we're at it, let's fact check your claims:

Doom 3:
[benchmark chart attachments]
Witcher 3:
[benchmark chart attachment]

It appears that 66% of your claims stray far from the truth, but then again, I can't say I'm surprised.
 
I think this would qualify as a low effort post, no? On top of being a low effort post, it's just a false equivalence. Show me a time in the past when Nvidia released an x80 product with worse performance per dollar than the previous x80 product and a gen-on-gen performance increase of only 15%. I'll wait. Let's not forget that the rapid rate of GPU improvement in those days renders this statement completely useless. In 2024, a $3000 5090 gets you a whole 28 frames in Cyberpunk 2077 at 4K with PT.

While we're at it, let's fact check your claims:

Doom 3:
[benchmark chart attachments]
Witcher 3:
[benchmark chart attachment]

It appears that 66% of your claims stray far from the truth, but then again, I can't say I'm surprised.

Try finding max settings, not numbers tailored to lower settings. I wrote max settings, and then you post numbers at a lower resolution (1080p) with no AA:
Doom 3:
[benchmark chart attachment]

The Witcher 3:
[benchmark chart attachment]

You do understand MAX settings, right?
(Because your post actually supports my claim about people reducing settings to have playable FPS 🤷‍♂️)
 
Try finding max settings, not numbers tailored to lower settings. I wrote max settings, and then you post numbers at a lower resolution (1080p) with no AA:
Doom 3:
[benchmark chart attachment]

The Witcher 3:
[benchmark chart attachment]

You do understand MAX settings, right?
(Because your post actually supports my claim about people reducing settings to have playable FPS 🤷‍♂️)
I can't help but laugh, lol. 4K ultra in 2015 on PC as a representative resolution of the times... yeah, OK. The same goes for 1600x1200 in 2004, but like I said in my original post, I'm not surprised. Using your logic, I should post benchmarks of Cyberpunk 2077 PT at 8K native so that you can see the difference between 8-12 fps and a 30 fps average, which many people still play at in 2025. In both of the examples you posted, a solid 30 fps is achievable at the laughable resolutions you chose. The same cannot be said for a $3000 5090 at 8K path traced.
 
I can't help but laugh, lol. 4K ultra in 2015 on PC as a representative resolution of the times... yeah, OK. The same goes for 1600x1200 in 2004, but like I said in my original post, I'm not surprised. Using your logic, I should post benchmarks of Cyberpunk 2077 PT at 8K native so that you can see the difference between 8-12 fps and a 30 fps average, which many people still play at in 2025. In both of the examples you posted, a solid 30 fps is achievable at the laughable resolutions you chose. The same cannot be said for a $3000 5090 at 8K path traced.

Your post just confirms my point:
No GPU can run Cyberpunk 2077 at MAX settings, so adjusting settings is needed 🤷‍♂️
Even @ 4K I have to use DLSS TNN Quality + FG with path tracing; if I use DLAA @ 4K my FPS gets too low.

Just like with Doom 3 when I ran it back in the day.
Just like The Witcher 3, where HairWorks and 4x AA also made that impossible at MAX settings 🤷‍♂️

And we do agree that 1080p is not MAX settings? :yep2:
 
Max settings are losing their meaning anyway. Crysis actually had visuals to justify the performance hit. These days games are crushing performance while looking like ass, and that’s before adding RT.

Either way, there's no law that says games must be playable at max settings, because that's just an arbitrary label the developer chose to use. They could easily rename medium to ultra, and it seems that would make folks happy. Results and IQ apparently don't matter as long as we can "max out" the settings menu.
 
Max settings are losing their meaning anyway. Crysis actually had visuals to justify the performance hit. These days games are crushing performance while looking like ass, and that’s before adding RT.

Either way, there's no law that says games must be playable at max settings, because that's just an arbitrary label the developer chose to use. They could easily rename medium to ultra, and it seems that would make folks happy. Results and IQ apparently don't matter as long as we can "max out" the settings menu.
I would argue that path tracing in Cyberpunk adds significant image quality, and with the new DLSS TNN models, the image quality of Cyberpunk 2077 version 2.21 is far greater than that of version 1.5 (the launch version).
I do not own Alan Wake 2, but from what I gather it also has quite an image quality uplift at max settings.

But we have been here before (aka "But can it run Crysis?"), so this seems like a circular argument popping up at intervals 🤷‍♂️
 
Your post just confirms my point:
No GPU can run Cyberpunk 2077 at MAX settings, so adjusting settings is needed 🤷‍♂️
Even @ 4K I have to use DLSS TNN Quality + FG with path tracing; if I use DLAA @ 4K my FPS gets too low.

Just like with Doom 3 when I ran it back in the day.
Just like The Witcher 3, where HairWorks and 4x AA also made that impossible at MAX settings 🤷‍♂️

And we do agree that 1080p is not MAX settings? :yep2:
In 2015, 1080p60 was max settings for me. I don't know about you, but for me and most other PC gamers, 1080p60 was max. I was able to run it at max settings at 1080p if I wanted. Today, I have a 4K 240 Hz monitor, and a 5090 can't even get me 1/8th of the way to my monitor's refresh rate at max settings in Cyberpunk.

There's a point where you have to read the tea leaves and be wise. We're in an era of unremarkable GPU gains, so resource efficiency should be prioritized. To me this means using techniques that maximize the resources available on the hardware, which is the opposite of what's going on. Unfortunately, developers behave like we're back in the '90s, when we expected rapid GPU advancement every generation. If developers or Nvidia think I'm spending 5090 money to be reliant on upscaling, then they should perhaps get their heads examined. For the asking price, you can buy a used car, buy a meaningful amount of stock, travel internationally, etc. There are times of exponential gains and breakthroughs and times of consolidation. We're in a time of consolidation, and it's about time we all got with the program.
 
Who remembers?

[image attachment]
 
I think this would qualify as a low effort post, no? On top of being a low effort post, it's just a false equivalence. Show me a time in the past when Nvidia released an x80 product with worse performance per dollar than the previous x80 product and a gen-on-gen performance increase of only 15%. I'll wait. Let's not forget that the rapid rate of GPU improvement in those days renders this statement completely useless. In 2024, a $3000 5090 gets you a whole 28 frames in Cyberpunk 2077 at 4K with PT.

While we're at it, let's fact check your claims:

Doom 3:
[benchmark chart attachments]
Witcher 3:
[benchmark chart attachment]

It appears that 66% of your claims stray far from the truth, but then again, I can't say I'm surprised.
I think the claim about Doom 3 refers to its highest texture setting only running on GPUs with more than 256 MB of VRAM, which in August 2004 did not exist.
 