AMD FSR antialiasing discussion

  • Thread starter Deleted member 90741
So many reviews and analyses later, it's quite clear that this wasn't what some (or even many) were expecting. Perhaps that's not AMD's fault, but people around the internet thought it would be something close to DLSS, which it ain't. It's a substitute for TAAU in engines where that tech isn't implemented.
 
The purpose is to show image quality differences: you lock the frame rate so that each capture samples the exact same time step.

Will there be a performance-oriented analysis from DF?

Image quality acceptance depends on the gains and on which hardware those gains are achieved. No one really cares that a 6900 XT or an RTX 3090 jumps from 50 to 70 fps at 4K with Ultra Quality, but for an iGPU a jump from 25 to 60 fps in Performance mode matters more than the image quality.
 
I can't know exactly what you're referring to without timestamps, so that there are no misunderstandings.
FSR holds up quite well when using the Ultra Quality or Quality modes at 4K. Without zooming in, these modes look similar to native rendering in Godfall, which is the kind of result you want to achieve.
I was impressed with FSR’s ability to preserve fine detail with the Ultra Quality and Quality modes. Image quality also holds up well in a game like Anno 1800, which is another title that has a lot of fine detail in its native presentation, nice and sharp overall.
It was Anno, not Riftbreaker, sorry. But the difference in Anno is even more apparent than in Riftbreaker, so that's an even weirder take.
https://www.techspot.com/review/2277-amd-fsr-analysis-benchmark/

What do you mean?
Of all I've seen, DF is alone in their conclusions. They're the odd ones out, not HUB.
All reviews but HUB have managed to see and say something about the difference between native and FSR.
 
It was Anno, not Riftbreaker, sorry. But the difference in Anno is even more apparent than in Riftbreaker, so that's an even weirder take.
https://www.techspot.com/review/2277-amd-fsr-analysis-benchmark/


All reviews but HUB have managed to see and say something about the difference between native and FSR.
Nowhere does it say you can't spot the difference. But I'll be damned if the difference is great enough to not warrant the "similar" adjective they used.
 
Will there be a performance-oriented analysis from DF?

Image quality acceptance depends on the gains and on which hardware those gains are achieved. No one really cares that a 6900 XT or an RTX 3090 jumps from 50 to 70 fps at 4K with Ultra Quality, but for an iGPU a jump from 25 to 60 fps in Performance mode matters more than the image quality.
This review should satisfy you as they use an ASUS ROG Strix G15 Advantage Edition, which features an AMD Radeon RX 6800M graphics card.

All three games tested (Godfall, Terminator: Resistance and The Riftbreaker) show equally poor image quality. At 2560x1440 it is not much better, and really only in Ultra Quality mode. Only at 4K does the best FidelityFX Super Resolution preset give good results, and even then not everywhere. Looking through the screenshots and playing, I had the impression that FSR Ultra Quality at 4K looks best in Terminator. If I had to name the game with the worst FSR implementation of the three, I would point to Godfall, the very title AMD used to promote FidelityFX Super Resolution at Computex. The irony of fate.

As with similar techniques that reconstruct an image from a lower resolution, we can count on an increase in performance. However, much also depends on the resolution selected in the game. At Full HD, for example, neither Godfall nor Terminator gives a big jump in performance on an AMD Radeon RX 6800M; in return we get, to put it mildly, a broken image. Only The Riftbreaker offers a noticeable increase in FPS at 1920x1080. The higher the resolution set in the game, the better the frame-rate gains from FSR. Looking at the image quality, especially in Godfall and The Riftbreaker, I am not convinced that such a jump in performance is worth a noticeable deterioration in image quality. In Terminator, only at 4K did the game manage to offer quite good quality, and only in Ultra Quality mode. It seems a lot also depends on how well a given developer implements it.
 
Nowhere does it say you can't spot the difference. But I'll be damned if the difference is great enough to not warrant the "similar" adjective they used.
Similar means that you can't spot the difference. And they've even zoomed in apparently. Feel free to argue semantics without me.
 
I know where the performance is. My question was whether DF had any plans to investigate the other side of the coin.
Well, Digital Foundry tends to focus on the image quality and technical aspects of games rather than performance. Which is fine; there are many different review outlets that vary a lot in their focus.
 
Similar means that you can't spot the difference. And they've even zoomed in apparently. Feel free to argue semantics without me.
The first quote you brought literally says "if you don't zoom in".
IF
Not that they didn't.

There is no arguing semantics here; it doesn't get clearer unless you're being intentionally obtuse.
 
I know where the performance is. My question was whether DF had any plans to investigate the other side of the coin.
I'm not sure I understand this; the only side of the coin you should be measuring is the image quality.
1662p vs 4K FSR Ultra Quality: 1662p will have a better frame time, so you only need to compare the image quality.
540p vs 1080p FSR Performance: once again, 540p will have better frame times, so you only need to compare the image quality.

You only need to compare the base resolution that FSR upscales from against FSR itself. The frame time will be impacted by FSR; the user will need to decide, based on the image quality, whether it's worth it.

Versus a bilinear upscale, it appears to make sense. But there are other upsampling methods FSR will compete against, and they may produce better image quality with roughly the same performance.
 
The first quote you brought literally says "if you don't zoom in".
IF
Not that they didn't.
I see the differences clearly without zooming in. Zooming in highlights them, making them impossible to miss. "If" or not.

There is no arguing semantics here; it doesn't get clearer unless you're being intentionally obtuse.
Those who write such things are being intentionally obtuse. Which is exactly why I don't consider HUB a reliable source anymore.
 
I'm not sure I understand this; the only side of the coin you should be measuring is the image quality.
1662p vs 4K FSR Ultra Quality: 1662p will have a better frame time, so you only need to compare the image quality.
540p vs 1080p FSR Performance: once again, 540p will have better frame times, so you only need to compare the image quality.

You only need to compare the base resolution that FSR upscales from against FSR itself. The frame time will be impacted by FSR; the user will need to decide, based on the image quality, whether it's worth it.

Versus a bilinear upscale, it appears to make sense. But there are other upsampling methods FSR will compete against, and they may produce better image quality with roughly the same performance.

FSR doesn't exist to downgrade quality. It exists to increase performance at the cost of quality.

...I'm not sure what is going on with the interpretation of my question.
 
I see the differences clearly without zooming in. Zooming in highlights them, making them impossible to miss. "If" or not.


Those who write such things are being intentionally obtuse. Which is exactly why I don't consider HUB a reliable source anymore.
GamersNexus: "Ultra quality looks close enough to native 4K that the difference might not be immediately obvious"
LTT: "Compared to native, Ultra looks nearly indistinguishable"
TPU: "I'd say FSR Ultra Quality is "almost native", even FSR Quality is good enough not to notice much of a difference during actual gameplay."
Hothardware: "Without blowing the image up to 2x to find all the hard edges, or swapping back and forth between images, we're actually quite hard-pressed to find glaring differences. The stairs are a little more blurry, and perhaps there's not quite as much detail in the character's hair, but in motion it's basically indistinguishable."

You've made up your mind about HUB before even reading their piece.
 
FSR doesn't exist to downgrade quality. It exists to increase performance at the cost of quality.

...I'm not sure what is going on with the interpretation of my question.

You're looking at it backwards, I think. FSR takes an image at a base resolution and scales it up to a target resolution; it is not doing the reverse. So there is no performance gain at the cost of quality. The algorithm spends performance to increase quality: in this case, it samples the lower-resolution image and extracts detail for the higher resolution.

The base resolution for 4K Ultra Quality is 1662p. If you want to look at performance numbers, you compare 1662p against 4K FSR Ultra Quality in terms of image quality, i.e. what image quality is gained by spending some performance on FSR to bring the image up to 4K.
If you compare 4K Native vs 4K FSR and ignore image quality, then you are missing the point of what FSR is trying to achieve.
 