AMD FSR upscaling discussion

  • Thread starter Deleted member 90741
Can it be set at dynamic quality, targeting a fixed 60fps?
In addition to fixed scaling, FSR may be used in “arbitrary scaling” mode, whereby any area scale factor between 1x and 4x is supported. This mode is typically used for Dynamic Resolution Scaling, whereby source resolution is determined by a fixed performance budget to achieve a minimum frame rate.
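The quoted "fixed performance budget" idea can be sketched as a simple feedback controller. This is an illustrative sketch, not AMD's actual API: all names here (`update_render_scale`, the step size, the 0.9 headroom factor) are assumptions for illustration; only the 1x–4x area (1x–2x per-axis) range comes from the quote above.

```python
# Minimal sketch of a dynamic-resolution-scaling controller (illustrative,
# not AMD's actual API): each frame, compare the measured GPU frame time
# against the budget for the target frame rate and nudge the per-axis
# render scale, clamped to FSR's supported range (1x to 2x per axis,
# i.e. 1x to 4x by area).

TARGET_FPS = 60
FRAME_BUDGET_MS = 1000.0 / TARGET_FPS   # ~16.67 ms

MIN_SCALE = 0.5   # 2x per axis -> 4x by area, FSR's maximum upscale
MAX_SCALE = 1.0   # native resolution, no upscaling

def update_render_scale(scale: float, frame_time_ms: float,
                        step: float = 0.02) -> float:
    """Return the per-axis render scale to use for the next frame."""
    if frame_time_ms > FRAME_BUDGET_MS:
        scale -= step                        # over budget: render fewer pixels
    elif frame_time_ms < FRAME_BUDGET_MS * 0.9:
        scale += step                        # comfortably under: raise quality
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Example: a 20 ms frame at native resolution drops the scale slightly.
print(update_render_scale(1.0, 20.0))   # 0.98
```

Real engines typically smooth the frame-time measurement and quantize the scale to avoid oscillation, but the feedback loop is the same shape.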
 
Guru3D concerns coincide with my thoughts ...

Make no mistake, though; FSR as an implementation does work well! And it is going to help a great deal in situations where raytracing hurts performance. But think about what I write here: the industry is introducing technologies like raytracing to get image quality closer to reality, but is then lowering your game's image quality to boost performance. I find this a contradiction, and it is the dilemma I am wrestling with while writing this.

It was no different for NVIDIA in the early stages of DLSS. DLSS started out as a deep-learning algorithm from the first version, but got much better with DLSS 2.0 as NVIDIA kept training its algorithms. AMD's Super Resolution is a purely spatial upscaler, and it's questionable how far you can get with that other than changing the input resolution, trading image quality for performance. AMD's technology feeds itself with no more information than the original image and a depth map, whereas DLSS can make use of thousands of images to train from. With that in mind, FSR will be an intermediary solution that fills a performance gap at a cost; we feel AMD needs to embrace deep learning. In the end, we feel the Ultra Quality mode is not bad at all; the real question, however, is this: is "not bad at all" good enough?
 
AMD's Super Resolution is a purely spatial upscaler

Well, there you go. At least it'll work across any GPU, since there's no need for dedicated hardware (tensor cores etc.) for FSR.
 
So.. now I can say the leaked list checks out.

I don't recommend reading the guru3d article. They're simply wrong in a number of statements, to the point of calling FSR "a temporal scaler and those have a reputation for compromises". In at least one case (Anno 1800) they show the wrong images in the quality comparison, labeling as "ultra mode" what is actually the "performance mode". Add in questionable statements like "Terminator is a game that doesn't look good and that's probably why FSR does a good job here", and all around it reads like an extremely biased article.

The TPU one seems nice. I'm off to read a couple more.
 
I don't recommend reading the guru3d article. They're simply wrong in a bunch of statements
I recommend "reading" it simply because they provide 4K screenshots of all modes for comparison.
People can use their own eyes and draw their own conclusions from that.
For example:
https://img.guru3d.com/compare/fsr/godfall/native.jpg
https://img.guru3d.com/compare/fsr/godfall/ultraquality.jpg

TPU also does 50%/75% native to FSR comparison which is kinda cool - FSR manages to look better than 75% native most of the time.
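That 50%/75%-of-native comparison maps onto FSR's published per-axis scale factors (1.3x for Ultra Quality through 2.0x for Performance, per AMD's FSR 1.0 documentation). A quick sketch of the source resolutions each mode renders at for a 4K output; the rounding here is naive and may differ by a pixel or two from AMD's own tables:

```python
# Per-axis scale factors for FSR 1.0's quality modes, as published by AMD.
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality":       1.5,
    "Balanced":      1.7,
    "Performance":   2.0,
}

def source_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Resolution the game actually renders at before FSR upscales it."""
    factor = FSR_MODES[mode]
    return (round(out_w / factor), round(out_h / factor))

# Source resolutions for a 3840x2160 (4K) output:
for mode in FSR_MODES:
    w, h = source_resolution(3840, 2160, mode)
    print(f"{mode:>13}: {w}x{h}")
```

Note that even Ultra Quality renders at roughly 77% of native per axis, so TPU comparing FSR against a plain 75% scale is an apples-to-apples baseline.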
 
Test: AMD FidelityFX Super Resolution – we check the performance and image quality. Is it real competition for NVIDIA DLSS? | PurePC.pl

All three games tested, Godfall, Terminator: Resistance and The Riftbreaker, present similarly poor image quality. At 2560x1440 it is not much better, and then only in Ultra Quality mode. Only at 4K does the best FidelityFX Super Resolution preset give good results, and even then not everywhere. Looking through the screenshots and playing, I had the impression that FSR Ultra Quality at 4K looks best in Terminator. If I were to name the title with the worst FSR implementation among these games, I would point to Godfall, the very title AMD used to promote FidelityFX Super Resolution at Computex. The irony of fate, indeed.

As with similar techniques that reconstruct an image from a lower resolution, we can count on a performance increase, though much depends on the resolution selected in the game. At Full HD, for example, neither Godfall nor Terminator gives a big jump in performance on an AMD Radeon RX 6800M card; in return we get, to put it mildly, a broken image. Only The Riftbreaker offers a noticeable FPS increase at 1920x1080. The higher the resolution set in the game, the better FSR's frame-rate results. Looking at image quality, however, especially in Godfall and The Riftbreaker, I am not convinced that such a performance jump is worth a noticeable deterioration in image quality. In Terminator, only at 4K and only in Ultra Quality mode did it manage to offer quite good quality. It seems a lot also depends on the quality of implementation by a given developer.
 
Here's what guru3d is doing:
[attached image: anno-compare.jpg]
Not to mention stuff like this:

We don't have to write pages full on AMD's new feature, FSR as a technology works. If you need faster framerates and apply it, this will absolutely help you boost your framerates. However it's a temporal scaler, and these have been around for years. These have built a reputation with compromises on image quality at some point in games.


And finally, they couldn't help themselves from going on a tangent against AMD for "ignoring" machine learning in the conclusions. As if the implementation technology ever really mattered to the end user.

In the end, it's a publication I'd advise avoiding.
 