Nvidia DLSS 3 antialiasing discussion

Yes, so? As I've said, it's targeting 60 on XSS. The fact that it drops down to 30 (and below) doesn't mean that it's CPU limited; in fact it is most likely GPU limited on consoles.

A game is "targetting" 60fps if there's actually a 60fps, or at least an unlocked mode. It's targetting 30/40fps, as those are the caps. The very fact there's not a 60fps or pure-unlocked mode running at 1080p on the SX indicates CPU limitation is indeed a factor, as well as the SS holding 30fps at 1080p quite well vs the SX/PS5 only going up to 1440p at 30fps. If it was purely GPU limited then they would both be absolutely locked at 40fps easily at 1440p when the SS is running the same scenes without a drop at 30.

How is the SX not maintaining 40fps in this scene at 1440p while the SS is reaching 30 at 1080p, given the vast disparity in GPU resources between the two, if the CPU is not a factor?

[screenshot attachment]



And on the PC, here's the GPU at 84% at 54fps. Again in the video linked, which you refuse to watch.

[screenshot attachment]
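To put rough numbers on the GPU-disparity argument above, here's a back-of-envelope sketch using the commonly cited console GPU compute figures (the pixel counts are just the stated resolutions and frame rates, not measurements from the video):

```python
# Back-of-envelope pixel-throughput comparison, assuming the commonly
# cited GPU specs: Series S ~4.0 TFLOPS, Series X ~12.15 TFLOPS.
ss_tflops, sx_tflops = 4.0, 12.15

# Behaviour described above (steady-state figures for illustration only).
ss_pixels_per_s = 1920 * 1080 * 30      # SS holding 30fps at 1080p
sx_pixels_per_s = 2560 * 1440 * 40      # the 40fps/1440p target the SX misses

print(f"GPU compute ratio (SX/SS):     {sx_tflops / ss_tflops:.2f}x")
print(f"Pixel throughput ratio needed: {sx_pixels_per_s / ss_pixels_per_s:.2f}x")
# ~3.0x compute available vs ~2.4x extra pixel work needed; if the cost
# scaled purely with the GPU, the SX should clear 40fps with headroom,
# which is why the drops point at something other than the GPU.
```

Per-pixel cost obviously isn't constant across resolutions and settings, so treat it as a sanity check rather than proof.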
 
Are we sure it's a CPU limitation?

Just going by available evidence so far, yes, CPU is a limitation in this game. I mean, is there not a CPU on the planet that can maintain it at 60fps on PC? I don't know, as I haven't seen every potential CPU benchmark at every potential graphics setting. What's clear from the console numbers and the PC numbers I've seen in the video linked and others is that it is definitely a very CPU hungry game at least.
 
Just going by available evidence so far, yes, CPU is a limitation in this game. I mean, is there not a CPU on the planet that can maintain it at 60fps on PC? I don't know, as I haven't seen every potential CPU benchmark at every potential graphics setting. What's clear from the console numbers and the PC numbers I've seen in the video linked and others is that it is definitely a very CPU hungry game at least.
I really wonder why, then; I see nothing too crazy... Time will tell, I guess.
 
I was under the impression that's what you have to do with G-Sync for best results: set a frame cap a few frames below the refresh rate of your monitor.
You want to enable vsync and cap your fps a few frames below your monitor’s maximum refresh rate.
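For reference, a tiny sketch of that rule of thumb (the exact margin here is my assumption, not something stated in the thread):

```python
def gsync_fps_cap(refresh_hz: float, margin_fps: float = 3.0) -> float:
    """Suggested in-game fps cap for G-Sync/VRR: stay a few frames
    under the panel's refresh so frames never pile into V-Sync."""
    return refresh_hz - margin_fps

print(gsync_fps_cap(144))  # 141.0
print(gsync_fps_cap(60))   # 57.0
```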
 
So I tried DLSS3 in Spider-Man, and it looks great with very little artifacting that I noticed. I mostly noticed it on the UI elements. Very good with the controller....

......until I grabbed the mouse and started swinging the camera around much faster, as I would when playing games with KB/M... then it became quite noticeable in cases with high-contrast elements.

The tech is great, no doubt, but there's work to be done for sure. The UI elements are the most distracting part, however, from my very short time looking at it. And obviously that's the thing... I'm looking for it and I know where to look for it.


But anyway, yeah, I can admit I was wrong to say that you wouldn't notice it. Most of the time you probably won't, but in high-contrast scenes with fast-moving objects/cameras, you will notice some artifacting. Looking forward to seeing how it improves. With the much slower camera movement of the controller, it was way less noticeable, logically.
 
Ingame FPS Cap=60 without Reflex is very close to the Reflex scenarios on the right side, identical to the 60fps capped Reflex one.

Can you provide some evidence for this? Battlenonsense's findings contradict your claim:

The frame limiter is similar to Reflex in GPU-bound scenes; however, in CPU-bound scenes Reflex is far stronger.

[screenshot attachment]



The results show that while very effective in reducing overall system latency, a manual framerate limiter still isn't as effective as NVIDIA Reflex Low Latency Mode (or Reflex Low Latency + Boost). Additionally, it requires a lot of tinkering and user interaction, while NVIDIA Reflex is something you simply turn on and forget about—the game engine and the graphics driver talk with each other and continuously optimize your experience.


A few things to note: not all games have built-in frame limiters, and using an external frame limiter adds too much latency. Also, many of the games that DO have a frame limiter end up being not that useful, as they can only cap fps to certain quantized numbers (30/60/120, etc.), and some can't even limit beyond a certain number, 200fps for example. Reflex takes care of all of that.

You also need to find your sweet spot limit manually, play the game through all scenes, tweak the visual settings and then cap the fps way below what your GPU is capable of (to avoid being GPU bound in busy scenes), leaving performance on the table in other less intensive scenes. You don't have to do that with Reflex.
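As a rough illustration of that manual tuning (all numbers are hypothetical, and the safety margin is my assumption):

```python
# Hypothetical per-scene GPU-bound averages gathered by playing through the game.
scene_fps = {"village": 140, "market": 112, "rat_swarm": 78, "boss": 84}

margin = 0.90  # stay ~10% under the heaviest scene (assumed safety margin)
manual_cap = int(min(scene_fps.values()) * margin)

print(f"Manual cap: {manual_cap} fps")   # 70 fps
# Every scene that could have run at 112-140 fps is now held to 70 fps,
# which is the "performance left on the table" trade-off Reflex avoids.
```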

Certainly, but in a very typical competitive gaming scenario, players use low graphics settings and the GPU never gets close to 100%; rather, the monitor or occasionally the CPU is the limiting factor.
But we are not talking about Reflex in a competitive scenario only; we are talking about it in the context of DLSS3 and a wide umbrella of single-player, non-competitive games.
 
But we are not talking about Reflex in a competitive scenario only; we are talking about it in the context of DLSS3 and a wide umbrella of single-player, non-competitive games.
I brought up a relevant example to show how your simplistic generalization overstates the benefits of Reflex.

Reflex is a nice feature in many cases; there is no disagreement on that. If Reflex and other Nvidia tech/products are good and useful, they will stand on their own legs just fine. It's unnecessary to exaggerate them.
 
How is the SX not maintaining 40fps in this scene at 1440p while the SS is reaching 30 at 1080p, given the vast disparity in GPU resources between the two, if the CPU is not a factor?

Remember, Series S has more bandwidth per CU/TFLOP and more GPU cache than Series X too.

That will play a part in terms of GPU utilisation in certain situations.

With the bottom image you showed with the 3060 Ti, could it be that all those rats are entirely CPU driven on 1-2 threads, which is causing the low GPU use?
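On the bandwidth-per-CU/TFLOP point, a quick check against the commonly quoted specs (fast memory pool bandwidth only, so treat it as approximate):

```python
# Commonly quoted GPU specs; bandwidth is the fast memory pool only.
series_s = {"cus": 20, "tflops": 4.0,   "bw_gbs": 224}
series_x = {"cus": 52, "tflops": 12.15, "bw_gbs": 560}

for name, c in (("Series S", series_s), ("Series X", series_x)):
    print(f"{name}: {c['bw_gbs']/c['cus']:.1f} GB/s per CU, "
          f"{c['bw_gbs']/c['tflops']:.1f} GB/s per TFLOP")
# Series S: 11.2 GB/s per CU, 56.0 GB/s per TFLOP
# Series X: 10.8 GB/s per CU, 46.1 GB/s per TFLOP
```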
 
A game is "targetting" 60fps if there's actually a 60fps, or at least an unlocked mode. It's targetting 30/40fps, as those are the caps. The very fact there's not a 60fps or pure-unlocked mode running at 1080p on the SX indicates CPU limitation is indeed a factor, as well as the SS holding 30fps at 1080p quite well vs the SX/PS5 only going up to 1440p at 30fps. If it was purely GPU limited then they would both be absolutely locked at 40fps easily at 1440p when the SS is running the same scenes without a drop at 30.

How is the SX not maintaining 40fps in this scene at 1440p while the SS is reaching 30 at 1080p, given the vast disparity in GPU resources between the two, if the CPU is not a factor?

View attachment 7244



And on the PC, here's the GPU at 84% at 54fps. Again in the video linked, which you refuse to watch.

View attachment 7243
Graphical settings play an important role in the amount of compute required. So it's not just a resolution drop we are seeing here; it's also a steep drop-off in features as well.
 
Plague Tale 4K DLAA, Frame Generation Off / On:

Frame Gen Off - 74fps
Frame Gen On - 116 fps

Seems about 57% "faster" (GPU-bound cases don't quite double the fps from the looks of it), and I can't notice the input lag increase, tbh. In the same scene with frame generation off and Reflex disabled, input lag is worse than with frame generation enabled by about 12ms. My first reaction is definitely that it looks smoother, but I guess I'm expecting the input lag to match what I'm looking at; it feels a bit weird in this game.
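A quick check of that "about 57%" figure (simple arithmetic on the numbers above, nothing measured here):

```python
fps_off, fps_on = 74, 116
print(f"Frame generation uplift: {fps_on / fps_off - 1:.1%}")  # 56.8%
# Well short of the 2x seen in purely CPU-limited cases, since at 4K DLAA
# the GPU still has to render the real frames alongside the generated ones.
```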
 
Plague Tale 4K DLAA, Frame Generation Off / On:

Frame Gen Off - 74fps
Frame Gen On - 116 fps

Seems about 57% "faster" (GPU-bound cases don't quite double the fps from the looks of it), and I can't notice the input lag increase, tbh. In the same scene with frame generation off and Reflex disabled, input lag is worse than with frame generation enabled by about 12ms. My first reaction is definitely that it looks smoother, but I guess I'm expecting the input lag to match what I'm looking at; it feels a bit weird in this game.
Oh, I did not know that the Experience overlay shows the latency before the signal is sent to the display.
In Spider-Man, FG in 4K with ray tracing is 5ms higher than FG off without Reflex - so 16ms instead of 11ms. :D Performance can be up to 2x in CPU-limited scenarios.

In Bright Memory with ray tracing in 4K, FG is 11ms higher - 62ms vs 51ms (100FPS!) without Reflex. With DLSS Quality and FG the latency drops to 44ms, and with Performance it is 35ms.
 
Using the driver V-Sync setting for Spider-Man increases "latency" by about 3x - up to 7x ms over 2x ms without it. We really need higher-Hz displays. Frame pacing in Spider-Man is so uneven that it shoots over my 144Hz limit with an average of 100fps...
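A small sketch of the frame-pacing point (made-up frametimes for illustration): an average of 100fps can still contain individual frames faster than the 144Hz budget, and those are the ones driver V-Sync holds back.

```python
# Hypothetical uneven frametimes (ms) that average out to ~100 fps.
frametimes_ms = [6.2, 14.8, 6.5, 13.9, 6.0, 12.6]

avg_fps = 1000 / (sum(frametimes_ms) / len(frametimes_ms))
budget_ms = 1000 / 144   # ~6.94 ms per refresh at 144 Hz

too_fast = [t for t in frametimes_ms if t < budget_ms]
print(f"Average: {avg_fps:.0f} fps")                               # 100 fps
print(f"Frames over the 144 Hz limit: {len(too_fast)} of {len(frametimes_ms)}")
```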
 
Using the driver V-Sync setting for Spider-Man increases "latency" by about 3x - up to 7x ms over 2x ms without it. We really need higher-Hz displays. Frame pacing in Spider-Man is so uneven that it shoots over my 144Hz limit with an average of 100fps...

Can you use a frame rate limiter to limit the framerate to, say, 70fps and then use DLSS3 to double it? Obviously that's probably a waste if you can get to near 144fps without frame generation anyway. Although if you throw on max DLDSR, I assume you could restrain even the 4090 enough.
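Rough arithmetic for that idea (the cap and the 2x factor are hypothetical; as the Plague Tale numbers above show, frame generation doesn't always reach a full 2x):

```python
base_cap = 70            # hypothetical rendered-frame cap (fps)
refresh_hz = 144
fg_factor = 2.0          # ideal-case frame generation multiplier (assumed)

presented = base_cap * fg_factor
print(f"Presented: {presented:.0f} fps, "
      f"headroom below {refresh_hz} Hz: {refresh_hz - presented:.0f} fps")
# 140 fps presented, 4 fps of headroom under the 144 Hz refresh
```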
 