I simply cannot understand why ANYONE would give a damn about DLSS3 in that game when it CONSTANTLY stutters and hitches due to shader compilation.
> The first one they cap the frame rate at 120 fps, so DLSS3 is running at a real 60 fps while the others run at a real 120.
I think that seems like a good test: compare 120fps to "120fps" and see how they subjectively feel next to each other when actually playing. The whole selling point is higher fps, but if those extra frames feel and look weird in comparison, it might not be worth it to some. After all, DLSS3 shouldn't exist solely for Nvidia fan-morons to point at fps graphs and jerk off.
Entirely agreed. If you already have the brute-force power to get these framerates anyway, why are you using DLSS3 with the frame generator? The folks who want that framerate will be the folks who can't get there with brute force. Seems like a great comparison to me: how close can you get to "the real thing" using only these new rendering tricks?
I'm not against the idea of these extra generated frames, at least when it's needed to achieve higher frame rates for faster monitors etc. If there are gamers who like the experience and feel the quality is there for them, then great. It's a useful feature coming from hardware that has existed for other purposes.
I just think the testing methodology made a lot of sense.
Still agreed entirely.
I stepped off the high-end PC hardware treadmill many years ago, so I rarely pay attention to all the latest GPU features.
But something caught my eye when I was searching for a different topic entirely.
THIS "framerate multiplier" in the game Combat Arms (in a blog post from 2016 and all the way at the bottom) sounds like the same kind of thing as DLSS3. They boost framerate from a native 120fps up to 360fps, which they call a 4x increase (since the actual rendered frames drops to 90 due to overhead). They used a GTX 1070 in their example. They point out distortion when movement is rapid.
Is DLSS3 basically the same thing, except that it has dedicated hardware to correct distortion and artifacts, and, presumably, to provide these extra frames at much lower cost?
How many other games have done this? I only know of The Force Unleashed 2, and that was never implemented in the final game.
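To make the idea concrete, here's a rough sketch of what that kind of interpolation boils down to, assuming the simplest possible approach: estimate per-pixel motion between two rendered frames and warp halfway along it. This is not how DLSS3 or the Combat Arms multiplier is actually implemented (filenames and parameters are placeholders), but it shows the basic mechanism and why rapid movement produces the distortion they mention.

```python
# Crude CPU-side illustration of optical-flow frame interpolation -- NOT the
# DLSS3 or Combat Arms implementation, just the general idea.
import cv2
import numpy as np

prev = cv2.imread("frame_0.png")    # rendered frame N (placeholder filename)
nxt = cv2.imread("frame_1.png")     # rendered frame N+1

prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
nxt_gray = cv2.cvtColor(nxt, cv2.COLOR_BGR2GRAY)

# Dense optical flow from frame N to frame N+1 (per-pixel motion vectors).
flow = cv2.calcOpticalFlowFarneback(prev_gray, nxt_gray, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)

# Backward-warp frame N halfway along the flow to approximate a frame at t = 0.5.
h, w = prev_gray.shape
grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
map_x = (grid_x - 0.5 * flow[..., 0]).astype(np.float32)
map_y = (grid_y - 0.5 * flow[..., 1]).astype(np.float32)
middle = cv2.remap(prev, map_x, map_y, cv2.INTER_LINEAR)

cv2.imwrite("frame_0_5.png", middle)
# The faster things move, the larger the flow vectors between the two real
# frames, and the more any estimation error shows up as visible distortion.
```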
But you don't get the same frames without it. I'm playing Loopmancer on a 4090, so at 3440x1440 with ray tracing I get ~100fps. With FG the fps can go up to the monitor limit of 170Hz. ~70% more frames means better clarity on my 170Hz fast IPS panel. In Spider-Man, fps are doubled in CPU-limited scenarios, going up from 70-80 to 140-160FPS.
I'm not sure anything you said hasn't been stated ad nauseam since the DLSS3 announcement. If you really like the technology, then that's great; as I said, it might be worth it to some and not to others. I was mostly speaking about the testing methodology making sense as a way to compare native vs FG. I wasn't saying the technology is useless and needed Nvidia folks to come to its defense again.
There are many applications for FG which will help make use of high-speed panels like fast IPS and OLEDs. Yet people still think they can feel 10ms of additional latency...
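A quick back-of-the-envelope on that 10ms figure, under the (hedged) assumption that frame generation has to hold back one rendered frame so it has two to interpolate between; the numbers are just the Loopmancer example above, not measurements:

```python
# Rough arithmetic only; assumes FG buffers one native frame (plus some
# generation overhead), which is where the extra input lag comes from.
native_fps = 100                       # the ~100fps Loopmancer example above
frame_time_ms = 1000 / native_fps
print(f"one native frame held back = ~{frame_time_ms:.0f} ms extra latency")

output_fps = 170                       # monitor limit with FG enabled
print(f"{output_fps / native_fps:.1f}x the displayed frames, i.e. ~70% more")
```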
> ~70% more frames means better clarity on my 170Hz fast IPS panel.
You can improve "clarity" through other means as well:
Blur Busters TestUFO Motion Tests
Blur Busters UFO Motion Tests with ghosting test, 30fps vs 60fps, 120Hz vs 144Hz vs 240Hz, PWM test, motion blur test, judder test, benchmarks, and more. www.testufo.com
You can combine both. Use 60FPS as a basis, generate an additional 60FPS, and activate BFI.
Which makes the errors in the frame generation more "clear" than without, I suppose.
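For what the "combine both" suggestion buys in motion clarity, here's the usual sample-and-hold rule of thumb (an approximation that ignores pixel response time; the scroll speed is just an example): perceived blur is roughly motion speed times how long each frame stays lit, and BFI cuts that visibility time.

```python
# Sample-and-hold motion blur rule of thumb (approximation, ignores pixel
# response): blur in pixels ~= motion speed * time each frame is visible.
speed_px_per_s = 960          # e.g. a TestUFO object scrolling at 960 px/s

def blur_px(refresh_hz, duty_cycle=1.0):
    persistence_s = duty_cycle / refresh_hz   # fraction of the cycle the frame is lit
    return speed_px_per_s * persistence_s

print(blur_px(60))                    # plain 60fps sample-and-hold       -> 16 px of smear
print(blur_px(120))                   # 60fps base + generated frames     ->  8 px
print(blur_px(120, duty_cycle=0.5))   # the same 120Hz with BFI enabled   ->  4 px
```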
And would the user with a 3060/Ti trying to get from 30 to 45 or 60fps have the same experience? Doubtful. At a lower native frame rate, fast-moving content travels a greater distance from one frame to the next, so there's a higher probability of artifacts in the interpolated frames, and the latency is already much worse than in your example. So it doesn't directly translate to the majority of users, since most have more affordable mainstream hardware that already struggles to reach a solid 60fps at 1080p in most titles, even with the lowest RT settings enabled.
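Some illustrative numbers for that point (arithmetic only, with an assumed pan speed, not measurements): at a 30fps base the interpolator has to bridge a much larger gap between real frames, and the held-back frame costs far more time than in the 4090 example.

```python
# Illustrative arithmetic: how far content moves between two real frames, and
# the rough cost of buffering one of them, at a high vs low native frame rate.
speed_px_per_s = 1920      # assumed pan speed: one 1080p screen width per second

for base_fps in (100, 30):
    gap_px = speed_px_per_s / base_fps     # distance to bridge between real frames
    held_ms = 1000 / base_fps              # one buffered frame's worth of latency
    print(f"{base_fps}fps base: ~{gap_px:.0f} px between real frames, "
          f"~{held_ms:.0f} ms extra latency")
# -> ~19 px / ~10 ms at 100fps vs ~64 px / ~33 ms at 30fps: several times worse
#    on both counts at the low end.
```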
Nvidia Optical Flow SDK v4.0 is out.
- NVIDIA Optical Flow engine-assisted video frame rate up conversion (NvOFFRUC) library and API
- Windows Subsystem for Linux (WSL) support
- ADA - 80% reduction in the number of SW kickoffs from Display Driver to GPU
- ADA - Optimized Compute engine with 80% reduction in blits, resulting in fewer context switches
- ADA - 2.5x performance improvement over Ampere generation GPUs
- ADA - 15% quality improvement
Optical Flow SDK - Developer Program
developer.nvidia.com
After tons of dirty work with NvOFFRUC+ffmpeg+Avidemux+Resolve, I've made a short demo video. You can see the glitches in the interpolated frames, especially in her feet. Watch the video in 1080p60.
Wait, what? It actually manages to create artifacts even when interpolating simple video?!
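For anyone who wants to poke at this kind of frame rate up-conversion without setting up the SDK, a rough CPU-based stand-in is ffmpeg's motion-compensated minterpolate filter (not NvOFFRUC, and far slower, but it produces the same class of artifacts; filenames here are placeholders):

```python
# CPU-based stand-in for the NvOFFRUC experiment: ffmpeg's motion-compensated
# interpolation filter. Not the Optical Flow SDK path, just a way to generate
# the same kind of interpolation glitches for comparison.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "input_30fps.mp4",
    "-vf", "minterpolate=fps=60:mi_mode=mci",   # motion-compensated interpolation up to 60fps
    "output_60fps.mp4",
], check=True)
# Fast-moving limbs are exactly where the motion estimation tends to break down,
# which matches the glitches around the feet in the demo video.
```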