RT performance mode - how hard is it to implement? Doesn't it make more sense than resolution?

ZebMacahan

Newcomer
I've realised that the reason I've never chosen fidelity mode in any game since I got my OLED is that OLEDs handle 30 fps very poorly.

I personally favor RT and other visual features over resolution, and considering OLEDs handle 30 fps poorly, wouldn't it make sense to have an RT performance mode in most games? 30 fps looks like a blurry mess when there's motion on an OLED, so what use is that resolution anyway?

From a developer standpoint, how difficult is it to implement this mode? How do the performance gains from lowering the resolution weigh against the performance cost of RT and the other visual eye candy used in fidelity modes?
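For a rough sense of the resolution side of that trade-off, here is the back-of-envelope pixel math (just an illustration; I realise GPU cost doesn't scale perfectly with pixel count, and RT adds costs of its own):

```cpp
#include <cstdio>

// Back-of-envelope pixel counts: dropping the internal resolution frees GPU
// work roughly in proportion to the pixels shaded, which is the budget an
// "RT performance mode" could spend on ray tracing instead. Real GPU cost
// does not scale perfectly linearly, and RT also adds CPU/BVH work that
// resolution changes don't help with.
int main() {
    struct Res { const char* name; int w, h; };
    const Res modes[] = {
        {"4K",    3840, 2160},
        {"1440p", 2560, 1440},
        {"1080p", 1920, 1080},
    };
    const double pixels4k = 3840.0 * 2160.0;

    for (const Res& r : modes) {
        const double pixels = double(r.w) * r.h;
        std::printf("%-6s %4dx%-4d  %5.1f%% of the 4K pixel count\n",
                    r.name, r.w, r.h, 100.0 * pixels / pixels4k);
    }
    return 0;
}
```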

From a marketing standpoint I guess it's good to have a game running at 4K, so I suppose that plays a part too.
 
I am not technically inclined so I am not going to give you a technical answer.

But "RT" requires a number of factors beyond arbitrary resolution and graphics settings as a tradeoff. Although RT usage can be affected significantly by GPU power in certain scenarios.

The simple answer to your question is that RT is just taxing on hardware to begin with, so depending on which RT effect you are using, there will inevitably be a cut in performance to run it on consoles.

We already see games cutting resolution to 1080p and below to get certain RT features running at 60fps, with FSR to bring the image quality back up, but in certain scenarios that won't be enough.

So what you're asking for (a mode where resolution is cut in favor of supporting RT) is already being done; in many cases it's still not enough to hold 60fps.

What I can tell you for sure is that, for games going forward, native resolution is dead. Dynamic resolution and temporal upscaling, or both at the same time, are what developers will increasingly prioritize because they give the GPU a lot more headroom. So whatever 30fps or 60fps mode you see in a game will reflect the limit of what is possible on the machine, not something being sacrificed in favor of a higher resolution, since image quality is no longer tied to native resolution or even a very high internal resolution.
 
That OLED comment about "very poor handling" of 30fps is so real, and the reality is that in console development, 30fps will always be around.

Back in 2017, I remember trying God of War on PS4 Pro on my friend's LG C7. Now, mind you, 30fps is already bad in games where you have to frequently move the camera, but on that OLED it looked so choppy that it seemed like I could see every frame during certain camera movements. It was almost unplayable to me.

From that day forward, I knew that I had to stick with mid to high end LED/LCD TVs in case I wanted the option to play a game in its fidelity mode.

To answer your question, here is how I look at the situation:

Obviously, game development is hard and resource-constrained. The resource can be time, money or, in the case of your dilemma, the availability of top-tier low-level programmers/engineers.

As hardware gets more powerful and graphics get better, it becomes more complex to develop games that take advantage of the many optimizations available across several different architectures.

For example, mesh shaders would greatly increase the amount of geometry drawn in many games... but at what "cost"?

- Will the game take longer to make, hence increasing monetary costs?
- Will it introduce other issues within the renderer or be a debugging nightmare?
- Does the studio even have a graphics lead with the experience to implement such a thing?

I apply part of this line of thinking to fidelity vs. performance modes... a lot of devs don't have the talent, time or tools to get RT into a performance mode.

Take UE4 for example. Epic doesn't have a "preset" within the engine for devs to easily take advantage of RT when targeting 60fps modes, so we haven't seen it (...in console games with complex geometry, animations or levels) since it's too intensive and would require a low-level engine programmer with experience on whatever console or API to optimize it.

This isn't throwing shade on any devs in particular since not every dev has or needs an RT specialist, an upscaling specialist, an I/O specialist etc.

So yeah, I guess that was just a long way of saying that I think 60fps RT modes on current gen consoles will be the exception and not the rule.
 
Thanks for your answers.

Personally, I always choose the RT performance modes in Ratchet, Miles Morales and Doom. They seem like the perfect balance to me.

I hope more games will have this option, but I understand that won't be the case, for various reasons. I think I have to start looking for a new TV :/
 
If you want things like RT and at least 60fps then you'll have to build a gaming PC and plug it into your TV, as that's never going to be a standard thing on consoles.

That said, I've enjoyed reading about this 30fps OLED issue, as it's something I've never seen people talk about and wasn't aware of.
 
It's possible to have RT at 60fps on consoles. Even a few multiplatform games have decent RT at 60fps.

It's just that, for now, some publishers (wrongly) think having better graphics at 30fps will sell more. It's a business decision: RT at 60fps obviously takes more time and optimization from a developer's point of view, so publishers think it's not worth it.

But they are wrong, IMO, as 30fps gaming is clearly not the future, even on consoles, since diminishing returns are very strong at the level of visual fidelity these consoles can already achieve.
 
I would be fine with 60fps gaming as the standard (i.e. all games using that target from the start of development), with an optional 30fps mode that adds things on top, but I don't think people would be happy going back to 30fps only.

Maybe if the Pro consoles of last gen hadn't existed and devs had started off this gen with 30fps games as usual, it would have been acceptable. But now that console plebs have tasted higher framerates, I think people would be mad without that option, because the standard has already changed.

With its feature set, UE5 will have a distinct advantage. Even if you have to cut your polygon budget in half and lower your resolution target significantly to reach 60fps, TSR and Nanite essentially still allow the effective polygon count to remain high and the image to be clean enough for 4K screens.
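Roughly, the resolution side of that works as a per-frame feedback loop: measure GPU time, scale the internal resolution to stay inside the frame budget, and let the temporal upscaler reconstruct to the fixed 4K output. Here's a minimal sketch of that idea; it's not TSR or any engine's actual heuristic, and the 16.6 ms target and scale limits are just assumptions:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Toy dynamic-resolution controller: each frame, measure how long the GPU
// took and scale the internal render resolution to stay near a frame-time
// target, while the output resolution (what the temporal upscaler
// reconstructs to) stays fixed at 4K. All numbers are illustrative.
struct DynResController {
    double targetMs = 16.6;  // 60 fps budget (assumption)
    double scale    = 1.0;   // internal res as a fraction of output res, per axis
    double minScale = 0.5;   // floor: 50% per axis = 25% of the pixels (assumption)
    double maxScale = 1.0;

    void update(double measuredGpuMs) {
        // GPU cost is roughly proportional to pixel count, i.e. scale^2,
        // so adjust the per-axis scale by the square root of the error.
        const double ratio = targetMs / measuredGpuMs;
        scale = std::clamp(scale * std::sqrt(ratio), minScale, maxScale);
    }
};

int main() {
    DynResController ctrl;
    const int outW = 3840, outH = 2160;

    // Pretend the GPU took these times (ms) on consecutive frames.
    const double gpuTimesMs[] = {22.0, 19.5, 17.2, 16.0, 16.5};
    for (double ms : gpuTimesMs) {
        ctrl.update(ms);
        const int w = static_cast<int>(outW * ctrl.scale);
        const int h = static_cast<int>(outH * ctrl.scale);
        std::printf("gpu %.1f ms -> render %dx%d, reconstruct to %dx%d\n",
                    ms, w, h, outW, outH);
    }
    return 0;
}
```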
 
If you want things like RT and at least 60fps then you'll have to build a gaming PC and plug it into your TV, as that's never going to be a standard thing on consoles.

That said, I've enjoyed reading about this 30fps OLED issue, as it's something I've never seen people talk about and wasn't aware of.
And not just any gaming PC, but a high-end one, with an Nvidia card that has enough memory.

At least until the RT benchmarks for the new Radeons come out.
 
60fps with RT might not be possible on the CPU side of things for all games. BVH maintenance and the large number of added draw calls can seriously limit the ability to hit higher framerates consistently.

That is why Spider-Man's 60fps RT mode on PS5 turns the crowd density down to lower than base-PS4 levels, to claw back CPU performance and make RT at 60 more feasible. Not all games can do that, of course, and still hit the visual goals they want.
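To put rough numbers on that (every per-system cost below is made up purely for illustration; only the 30/60fps frame budgets are real), the CPU-side arithmetic looks something like this:

```cpp
#include <cstdio>

// Illustrative CPU-side frame budget. Every per-system cost below is an
// assumption chosen for illustration; only the 30/60 fps budgets are real.
int main() {
    const double budget60 = 1000.0 / 60.0;  // ~16.7 ms per frame
    const double budget30 = 1000.0 / 30.0;  // ~33.3 ms per frame

    const double gameplayAndPhysics = 7.0;  // assumption
    const double renderSubmission   = 6.0;  // assumption: culling, draw calls
    const double bvhMaintenance     = 4.5;  // assumption: refits/rebuilds for animated meshes

    const double totalWithRt = gameplayAndPhysics + renderSubmission + bvhMaintenance;

    std::printf("CPU cost with RT: %.1f ms\n", totalWithRt);
    std::printf("fits 30 fps (%.1f ms)? %s\n", budget30, totalWithRt <= budget30 ? "yes" : "no");
    std::printf("fits 60 fps (%.1f ms)? %s\n", budget60, totalWithRt <= budget60 ? "yes" : "no");
    // 17.5 ms doesn't fit a 16.7 ms frame, so something has to give, e.g.
    // Spider-Man lowering crowd density to win back CPU time for RT at 60.
    return 0;
}
```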
 
This is probably why software Lumen will be so important this gen. RT can be a drain on performance at the best of times, and RT effects can be very difficult to implement while sticking to a 60fps target on lower-end hardware. So the next best thing is a less accurate but good-enough approximation, still based on ray tracing, that is cheaper to run and that anyone can make use of by default.

Then again, my understanding may be wrong.
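For what it's worth, the "cheaper approximation" idea can be sketched as tracing rays against a simplified scene representation (signed distance fields) instead of intersecting triangles through a BVH. The toy below is not Lumen's actual pipeline, just a minimal illustration of distance-field ray marching:

```cpp
#include <cmath>
#include <cstdio>

// Toy illustration of tracing a *simplified* scene representation: ray-march
// a signed distance field (here a single analytic sphere) instead of
// intersecting triangles through a BVH. Software Lumen's real pipeline is far
// more involved; this only shows the basic idea of distance-field tracing.
struct Vec3 { float x, y, z; };

static Vec3  add(Vec3 a, Vec3 b)  { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float len(Vec3 a)          { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

// Signed distance to a unit sphere centered at (0, 0, 5).
static float sceneSdf(Vec3 p) {
    const Vec3 c{0.0f, 0.0f, 5.0f};
    return len({p.x - c.x, p.y - c.y, p.z - c.z}) - 1.0f;
}

// March along the ray, stepping by the distance to the nearest surface.
static bool sphereTrace(Vec3 origin, Vec3 dir, float maxDist, float* hitT) {
    float t = 0.0f;
    for (int i = 0; i < 64 && t < maxDist; ++i) {
        const float d = sceneSdf(add(origin, mul(dir, t)));
        if (d < 1e-3f) { *hitT = t; return true; }
        t += d;  // the SDF guarantees this step won't overshoot the surface
    }
    return false;
}

int main() {
    float t = 0.0f;
    if (sphereTrace({0, 0, 0}, {0, 0, 1}, 100.0f, &t))
        std::printf("hit at t = %.3f\n", t);
    else
        std::printf("miss\n");
    return 0;
}
```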
 
And not just any gaming PC, but a high-end one, with an Nvidia card that has enough memory.

At least until the RT benchmarks for the new Radeons come out.

Depends on what you call high-end. The consoles sit at around a 3700X/RX 6600 XT; a 3060 Ti will be much more capable RT-wise, and you'd generally want a fast Zen 3 or better.
 
This is probably why software Lumen will be so important this gen. RT can be a drain on performance at the best of times, and RT effects can be very difficult to implement while sticking to a 60fps target on lower-end hardware. So the next best thing is a less accurate but good-enough approximation, still based on ray tracing, that is cheaper to run and that anyone can make use of by default.

Then again, my understanding may be wrong.

Is Lumen only for diffuse lighting, or does it handle reflections as well?
 
I've tried all the very heavy RT games like Dying Light 2, Chernobylite, CP2077 and Lego Builders.
Is that with or without DLSS? Even with DLSS, it's not necessarily true. The new Portal RTX, Quake RTX, and maybe even Minecraft RTX will bring it below 1080p60. Personally speaking, I have a 3080 and I consider it to be a 1080p60 card for ray tracing. At least, that's where I expect it to settle after next-gen games utilizing RT start coming out.
 

Quake 2 RTX is an easy 1080p60.

Lego Builders definitely needed DLSS for 1080p60.

Chernobylite was actually OK without DLSS and held 60fps for 85-90% of the game I've played.

Dying Light 2 was similar: native 60fps is possible about 70% of the time, but I stopped playing the game as the image quality is dreadful. Even now that I've moved to 1440p, the image is still bad.

Don't have Minecraft RTX or Portal RTX.

For me the 3080 is a 1440p RT card.
 