GART: Games and Applications using RayTracing

Hm. Isn't Spider-Man on the PS5 running at 1440p60 in ray-tracing mode?

Why does the Amazing RT preset need so much more power? The 3070/6900 XT are far more powerful than the PS5.

Hopefully they won't forget about the lower-tier RT GPUs with a medium RT preset that is on par with the PS5's RT performance mode.

Makes me a bit nervous about the optimization work, to be honest. IMO, High with RT should be renamed to PS5 RT Performance Mode, leaving Ultra for the higher-end PCs that can afford it. Then the spec sheet would make much more sense (PS5 RT Performance Mode -> 1440p60 on a 2060 (Super)) and Ultimate RT would stay as it is. Much better scaling.
The RT on the PS5 version had some pretty big cutbacks, so it depends on how far they want to push it on PC.

There were no reflections within reflections, and the reflection of transparent objects such as foliage was massively cut back.

You can already see in the PC feature announcement video that the amount of foliage and transparent objects being reflected is considerably higher than on PS5.

But I do agree with you that they need a PS5-quality setting for RT for those with ~RTX 2060 level GPUs.
 
It might not be needed when DLSS is enabled, but it will be interesting to see the impact once it is.

I don't understand, what will DLSS do?

The game already upscales to 4K on PS5, so all DLSS will do is enable a better-quality image from the upscale; it won't really boost performance all that much.
 
From what Alex was saying a few posts ago, the PS5 can scale all the way up from 1080p to 4K, so even DLSS in Performance mode isn't going to bring a performance advantage at the most demanding points.

The image-quality improvement (assuming it exists) over the PS5's TAAU is going to make for a very interesting comparison though.

Also, whether the PC version supports DRS with DLSS, as otherwise there may even be potential for the PS5 to have the better image quality in less demanding areas of the game vs DLSS P, i.e. 1440p internal with TAAU vs 1080p internal with DLSS P.

I'd also be interested to know how DLSS with DRS, if implemented, would handle the need to go below a 25% internal resolution ratio to maintain the target frame rate. Can it drop further, or is that the bottom rung, at which point frame rates have to take a hit?
 
I don't think there is any limit on how low DLSS can go with the internal resolution. UP mode is 1/9th, and people were pushing Control even below that through config edits.
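For reference, here's the pixel-count math behind those mode names. A quick sketch in C++ using the standard DLSS scale factors (1/2 per axis for Performance, 1/3 per axis for Ultra Performance); nothing here is game-specific:

#include <cstdio>

int main() {
    const int outW = 3840, outH = 2160; // 4K output
    // Performance renders at 1/2 the output width and height,
    // Ultra Performance at 1/3 - i.e. 1/4 and 1/9 of the output pixels.
    const double p = 1.0 / 2.0, up = 1.0 / 3.0;
    std::printf("P:  %dx%d = %.1f%% of output pixels\n",
                (int)(outW * p), (int)(outH * p), 100.0 * p * p);
    std::printf("UP: %dx%d = %.1f%% of output pixels\n",
                (int)(outW * up), (int)(outH * up), 100.0 * up * up);
    return 0;
}

That prints 1920x1080 at 25.0% for P and 1280x720 at 11.1% for UP - the 25% ratio and the 1/9th mentioned above.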
 
Technically there's no limit to how low it can go, but it does reach a point where the IQ drops below an acceptable level.
 
This is true for any DRS approach; DLSS isn't doing anything new here.
PC is trickier than consoles in this respect, as there are variable fps targets and possible bottlenecks that can't be avoided by lowering the resolution, but I'd expect it to work just as it does in other titles, like Deathloop.
 
How does it work in Deathloop? Does it hold to the target framerate while dropping the internal res as low as needed, or is there a lower bound for the res?
 
Holds to the framerate AFAICT.

I guess the question there is whether your system would ever need to push the resolution down below 25% of the output to maintain framerate, or if you have enough spare capacity to always stay above it. I know it's not a technical limitation of DLSS itself, but I'm unclear on whether games can set their own upper or lower bounds. For example, not all games implement UP mode as far as I can tell.
 
Implementing UP mode is very straightforward, and the games which opt not to implement it usually run very fast in P mode, making UP unnecessary I guess.

As for the DRS input resolution, DLSS provides the possible range back to the application: https://github.com/NVIDIA/DLSS/blob/main/doc/DLSS_Programming_Guide_Release.pdf, p. 7.
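To sketch what that contract looks like in practice - C++ with a hypothetical stand-in for the NGX optimal-settings query (these are not the SDK's real identifiers; the guide has the actual API):

#include <algorithm>
#include <cstdint>

// Hypothetical stand-in for the query described in the guide: for a given
// output resolution and quality mode, DLSS reports the minimum, optimal and
// maximum render dimensions the application is allowed to feed it.
struct DlssRenderBounds { uint32_t minW, minH, optW, optH, maxW, maxH; };

DlssRenderBounds queryDlssRenderBounds(uint32_t outW, uint32_t outH) {
    // Placeholder numbers for illustration only; real values come from the SDK.
    return { outW / 3, outH / 3, outW / 2, outH / 2, outW, outH };
}

// The DRS controller picks whatever scale its frame-time heuristic asks for,
// then clamps it into the range DLSS reported. If the clamp engages at the
// low end, the GPU is out of headroom and the frame rate takes the hit.
void chooseRenderRes(uint32_t outW, uint32_t outH, double drsScale,
                     uint32_t& renderW, uint32_t& renderH) {
    const DlssRenderBounds b = queryDlssRenderBounds(outW, outH);
    renderW = std::clamp(static_cast<uint32_t>(outW * drsScale), b.minW, b.maxW);
    renderH = std::clamp(static_cast<uint32_t>(outH * drsScale), b.minH, b.maxH);
}

int main() {
    uint32_t w = 0, h = 0;
    chooseRenderRes(3840, 2160, 0.20, w, h); // controller wants 20% per axis
    // With the placeholder bounds this clamps up to 1280x720: the floor holds,
    // and any further load shows up as dropped frames instead.
    return 0;
}

So the game can't silently feed DLSS anything below the reported minimum, and it's also free to impose tighter bounds of its own on top - which would explain titles that never ship UP.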
 
Apparently, Turing is not worth it for RT because "it can't achieve 60 FPS in every game at max settings and native resolutions. Therefore, RDNA1 is not at a disadvantage here because you can't use RT on a 2060 (Super) anyway" - Steve.


It's not like this narrative was disproven countless times by the great work of @Dictator...

Steve really annoys me to no end. Why must he use every opportunity he gets to downplay Turing's advantage when it comes to ray tracing with stupid BS like that?
 

Looks like a significant graphical improvement but with an enormous performance impact - probably why the devs chose to leave it off by default. The game features no upscaling tech to mitigate this, but if you have an NV card you can at least use NIS to get playable framerates and half-decent image quality.
 
With the Universal Unreal Engine Unlocker it's possible to use TAAU/TSR.

r.TemporalAA.Algorithm 1 to switch from 4th-gen TAA to 5th-gen (aka TSR).
r.ScreenPercentage 75, or whatever scaling you want.
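If you'd rather not run UUU every session, the same cvars can usually be made persistent through the game's Engine.ini under [SystemSettings] (the exact config path varies per game, and whether this particular title honours them is untested):

[SystemSettings]
r.TemporalAA.Algorithm=1
r.ScreenPercentage=75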
 
Looks like a significant graphical improvement but with an enormous performance impact - probably why the devs chose to leave it off by default. The game features no upscaling tech to mitigate this, but if you have an NV card you can at least use NIS to get playable framerates and half-decent image quality.
Maybe the effects run at full screen resolution, which could be why they hurt performance so much.

Aren't there commands like r.RayTracing.Reflections.ScreenPercentage and a roughness cutoff that can be applied to the game to optimize performance?
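If the game exposes them - stock UE does, though shipping builds vary - something like this through UUU would be the place to start, with purely illustrative values:

r.RayTracing.Reflections.ScreenPercentage 50
r.RayTracing.Reflections.MaxRoughness 0.4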
 