Nvidia DLSS 3 antialiasing discussion

In the first one they cap the frame rate at 120 fps, so DLSS 3 is running at a real 60 fps while the others run at a real 120.
I think that seems like a good test: comparing 120 fps to "120 fps" to see how they subjectively feel next to each other while actually playing. The whole selling point is higher fps, but if those extra frames feel and look weird in comparison, it might not be worth it to some. After all, DLSS 3 shouldn't exist solely for Nvidia fan-morons to point at fps graphs and jerk off.
 
Entirely agreed. If you already have the brute-force power to get these framerates anyway, why are you using DLSS3 with the frame generator? The folks who want that framerate will be the folks who can't get there with brute force. Seems like a great comparison to me: how close can you get to "the real thing" using only these new rendering tricks?

And if the answer is the framerate is higher but the visual quality is lower, then maybe a better alternative is simply turning down other visual fidelity settings in the game to pick up the extra frames.
 
I'm not against the idea of these extra generated frames, at least when it's needed to achieve higher frames for faster monitors etc. If there are gamers who like the experience and feel the quality is there for them, then great. It's a useful feature coming from hardware that has existed for other purposes.

I just think the testing methodology made a lot of sense.
 
Still agreed entirely.
 
I stepped off the high-end PC hardware treadmill many years ago, so I mostly don't pay attention to all the latest GPU features.
But something caught my eye when I was searching for a different topic entirely.
THIS "framerate multiplier" in the game Combat Arms (in a blog post from 2016 and all the way at the bottom) sounds like the same kind of thing as DLSS3. They boost framerate from a native 120fps up to 360fps, which they call a 4x increase (since the actual rendered frames drops to 90 due to overhead). They used a GTX 1070 in their example. They point out distortion when movement is rapid.
Is DLSS3 basically the same thing, except that it has dedicated hardware to correct distortion and artifacts, and, presumably, to provide these extra frames at much lower cost?
How many other games have done this? I only know of The Force Unleashed 2, and that was never implemented in the final game.
 
All PSVR titles boost from 60 to 120 fps, the Meta Quest boosts from 45 to 90, and their new "Pro" one boosts from 60 to 120.

It hasn't been very popular elsewhere because of the issues a lot of games run into, but it's necessary in VR titles that can't hit high enough frame rates on their own, so, you know, people don't vom and such.
 
But you don't get the same frames without frame generation. I'm playing Loopmancer on a 4090: at 3440x1440 with ray tracing I get ~100 fps, and with FG the fps can go up to the monitor's limit of 170 Hz. ~70% more frames means better clarity on my 170 Hz fast IPS panel. In Spider-Man, fps are doubled in CPU-limited scenarios, going from 70-80 up to 140-160 fps.

There are many applications for FG that will help make use of high-speed panels like fast IPS and OLEDs. Yet people still think they can feel 10 ms of additional latency...
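For context on that last point, here is a rough back-of-the-envelope model of where the extra latency comes from: an interpolated frame can only be built once the next real frame exists, so frame generation has to hold one real frame back. This is an illustration, not Nvidia's actual pipeline, and the 1 ms generation cost below is an assumed placeholder rather than a measured number.

```python
# Rough latency model (illustration only): the interpolated frame can only be
# produced once the *next* real frame has been rendered, so frame generation
# holds one real frame back. Added input latency is therefore roughly one
# native frame time plus the generation cost.
def added_latency_ms(native_fps: float, gen_cost_ms: float = 1.0) -> float:
    # gen_cost_ms is an assumed placeholder for the time to build the frame
    return 1000.0 / native_fps + gen_cost_ms

for fps in (100, 70, 40, 30):
    print(f"{fps:3d} fps native -> ~{added_latency_ms(fps):.0f} ms added latency")
# 100 fps -> ~11 ms, 70 fps -> ~15 ms, 40 fps -> ~26 ms, 30 fps -> ~34 ms
```

At ~100 fps native that lands in the ~10 ms ballpark mentioned above, but it grows quickly as the native rate falls.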
 
I'm not sure anything you said hasn't been stated ad nauseam since the DLSS 3 announcement. If you really like the technology, that's great; as I said, it might be worth it to some and not to others. I was mostly talking about the testing methodology making sense as a way to compare native vs FG. I wasn't saying the technology is useless and that it needed Nvidia folks to come to its defense again.

I'll quote my 2nd reply:
I'm not against the idea of these extra generated frames, at least when it's needed to achieve higher frames for faster monitors etc. If there are gamers who like the experience and feel the quality is there for them, then great. It's a useful feature coming from hardware that has existed for other purposes.

I just think the testing methodology made a lot of sense.
 
I simply cannot understand why ANYONE would give a damn about DLSS 3 in that game when it CONSTANTLY stutters and hitches due to shader compilation.

I wonder if DLSS 3 could help with this? I.e. what if you lock your frame rate at a level comfortably below your CPU's limits, but then use DLSS 3 to get it back up to something reasonable? Would all that idle CPU power help reduce the shader compilation stutter?
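One way to reason about that question, purely as a back-of-the-envelope sketch with made-up numbers:

```python
# Hypothetical numbers: how much CPU headroom does a frame cap free up per frame?
cpu_limit_fps = 140   # assumed CPU-bound frame rate without a cap
capped_fps = 90       # assumed cap, with DLSS 3 FG filling the display rate back up

busy_ms = 1000.0 / cpu_limit_fps        # ~7.1 ms of CPU work per rendered frame
budget_ms = 1000.0 / capped_fps         # ~11.1 ms available per rendered frame
spare_ms = budget_ms - busy_ms          # ~4.0 ms idle per frame
spare_fraction = 1 - capped_fps / cpu_limit_fps   # ~36% of the main thread idle

print(f"spare CPU time: {spare_ms:.1f} ms per frame ({spare_fraction:.0%} of the budget)")
```

Whether that spare time actually helps depends on the engine compiling shaders asynchronously in the background; an engine that compiles synchronously on first use will still hitch at the moment the new shader is needed, headroom or not.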
 
~70% more frames means better clarity on my 170 Hz fast IPS panel. In Spider-Man, fps are doubled in CPU-limited scenarios, going from 70-80 up to 140-160 fps.
You can improve "clarity" through other means as well:
 
And would the user with a 3060/Ti trying to get from 30 fps up to 45 or 60 fps have the same experience as your 4090 example? Doubtful. At a lower native frame rate, fast-moving images cover more distance from one frame to the next, so the interpolated frames have a higher chance of visible artifacts, and the baseline latency is already much worse than in your case. So it doesn't directly translate to the majority of users: most have more affordable mainstream hardware that already struggles to reach a solid 60 fps at 1080p in most titles, even with the lowest RT settings enabled.
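To put an illustrative number on the "increased distance" point (the panning speed below is an arbitrary assumption, chosen only to show the scaling):

```python
# Illustrative only: for a fixed on-screen motion speed, the pixel gap the
# interpolator has to bridge between two real frames scales with 1/fps.
panning_speed_px_per_s = 2000.0   # assumed fast pan at 1080p

for native_fps in (100, 60, 30):
    gap_px = panning_speed_px_per_s / native_fps
    print(f"{native_fps:3d} fps native -> {gap_px:4.0f} px of motion between real frames")
# 100 fps -> 20 px, 60 fps -> 33 px, 30 fps -> 67 px: the 30 fps case has to
# reconstruct over three times as much motion per generated frame, and each
# mistake stays on screen for longer.
```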
 
Nvidia Optical Flow SDK v4.0 is out.
  • NVIDIA Optical Flow engine-assisted video frame rate up conversion (NvOFFRUC) library and API
  • Windows Subsystem for Linux (WSL) support
  • Ada: 80% reduction in the number of SW kickoffs from the display driver to the GPU
  • Ada: optimized compute engine with an 80% reduction in blits, resulting in fewer context switches
  • Ada: 2.5x performance improvement over Ampere-generation GPUs
  • Ada: 15% quality improvement
 
After tons of dirty work with NvOFFRUC + ffmpeg + Avidemux + Resolve, I've made a short demo video. :LOL:
You can see the glitches in the interpolated frames, especially around her feet. Watch the video in 1080p60.
Wait what, it actually manages to create artifacts even when interpolating simple video?!
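For anyone who wants a quick point of comparison without the NvOFFRUC toolchain, ffmpeg ships a CPU-based motion-compensated interpolation filter (minterpolate) that can be run on the same clip to see what kinds of artifacts a non-hardware approach produces. A minimal sketch, with placeholder file names and frame rates:

```python
# CPU-side comparison point: ffmpeg's built-in motion-compensated
# interpolation filter (minterpolate). Slow, but useful for comparing its
# artifacts against the NvOFFRUC output on the same clip.
# File names and target frame rate are placeholders.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "input_30fps.mp4",
    "-vf", "minterpolate=fps=60:mi_mode=mci:mc_mode=aobmc:vsbmc=1",
    "output_60fps_interpolated.mp4",
], check=True)
```

mi_mode=mci selects full motion-compensated interpolation; the simpler blend and dup modes trade interpolation artifacts for softness or judder instead.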
 
Has there been any analysis done on FG quality at various raw fps values?
 