They are the only publication whose FSR review results go against what I see in screenshots and videos with my own eyes. Cute or not.
Which portion of the results do you contest, specifically?
> Which portion of the results do you contest, specifically?

The one where they manage to not see any difference between FSR and native in Godfall and Riftbreaker for example.
> Are those FPS normalized to 60/120FPS there? Why?

The purpose is to show image quality differences - you lock the framerate so that the frame sample is the exact same time step.
> The one where they manage to not see any difference between FSR and native in Godfall and Riftbreaker for example.

I can't know what you are exactly referring to without exact timestamps, so that there are no misunderstandings.
> They are the only publication whose FSR review results go against what I see in screenshots and videos with my own eyes. Cute or not.

What do you mean?
FSR holds up quite well when using the Ultra Quality or Quality modes at 4K. Without zooming in, these modes look similar to native rendering in Godfall, which is the kind of result you want to achieve.
> I was impressed with FSR’s ability to preserve fine detail with the Ultra Quality and Quality modes. Image quality also holds up well in a game like Anno 1800, which is another title that has a lot of fine detail in its native presentation, nice and sharp overall.

It was Anno not Riftbreaker, sorry. But the difference in Anno is even more apparent than in Riftbreaker, so that's an even weirder take.
> What do you mean?

All reviews but HUB have managed to see and say something about the difference between native and FSR.
Of all I've seen, DF is alone in their conclusions. They're the odd ones out, not HUB.
> The one where they manage to not see any difference between FSR and native in Godfall and Riftbreaker for example.

Timestamp?
> It was Anno not Riftbreaker, sorry. But the difference in Anno is even more apparent than in Riftbreaker so that's an even weirder take.

Nowhere does it say you can't spot the difference. But I'll be damned if the difference is great enough to not warrant the "similar" adjective they used.
https://www.techspot.com/review/2277-amd-fsr-analysis-benchmark/
> Will there be a performance oriented analysis from DF?

This review should satisfy you as they use an ASUS ROG Strix G15 Advantage Edition, which features an AMD Radeon RX 6800M graphics card.
How acceptable the image quality is depends on the size of the gains and on the hardware where those gains are achieved. No one really cares that a 6900 XT or an RTX 3090 jumps from 50 to 70 fps at 4K with Ultra Quality, but a jump from 25 to 60 fps in Performance mode matters more than the image quality on an iGPU.
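To put numbers on why the iGPU jump matters more, here is a quick sketch of the arithmetic. The figures are the illustrative ones from the comment above, and the helper name `gain_and_frametime` is my own:

```python
def gain_and_frametime(fps_before, fps_after):
    """Return the relative FPS gain (%) and the frame-time saving in ms."""
    gain_pct = (fps_after - fps_before) / fps_before * 100
    saved_ms = 1000 / fps_before - 1000 / fps_after
    return gain_pct, saved_ms

# High-end GPU at 4K Ultra Quality: 50 -> 70 fps
print(gain_and_frametime(50, 70))   # ~40% faster, ~5.7 ms saved per frame

# iGPU in Performance mode: 25 -> 60 fps
print(gain_and_frametime(25, 60))   # ~140% faster, ~23.3 ms saved per frame
```

The same upscaler buys roughly four times the frame-time saving in the iGPU scenario, which is why a larger image-quality cost can be acceptable there.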
All three games tested - Godfall, Terminator: Resistance, and The Riftbreaker - show equally poor image quality. At 2560x1440 it is not much better, and really only in Ultra Quality mode. Only at 4K does the best FidelityFX Super Resolution preset give good results, and even then not everywhere. Looking through the screenshots and playing, I had the impression that FSR Ultra Quality at 4K looks best in Terminator. If I had to name the title with the worst FSR implementation among these games, I would point to Godfall - the very title AMD used to promote FidelityFX Super Resolution at Computex. The irony of fate, indeed.
As with similar techniques for reconstructing an image from a lower resolution, we can count on an increase in performance. However, much depends on the resolution selected in the game. At Full HD, for example, neither Godfall nor Terminator gives a big jump in performance on an AMD Radeon RX 6800M card; in return we get, to put it mildly, a broken image. Only The Riftbreaker offers a noticeable FPS increase at 1920x1080. The higher the resolution set in the game, the better FSR's frame-rate results. Looking at image quality, especially in Godfall and The Riftbreaker, I am not convinced that such a performance jump is worth the noticeable deterioration in image quality. In Terminator, only at 4K and only in Ultra Quality mode did FSR manage to deliver quite good quality. It seems a lot also depends on how well a given developer implements it.
> Nowhere does it say you can't spot the difference. But I'll be damned if the difference is great enough to not warrant the "similar" adjective they used.

Similar means that you can't spot the difference. And they've even zoomed in, apparently. Feel free to argue semantics without me.
> I know where the performance is. My question was to know if DF had plans in investigating the other side of the coin.

Well, Digital Foundry tend to focus on the image quality and technical aspects of gaming rather than performance. Which is fine; there are many different outlets for reviews that vary a lot in their focus.
> Similar means that you can't spot the difference. And they've even zoomed in apparently. Feel free to argue semantics without me.

The first quote you brought literally says "if you don't zoom in".
> I know where the performance is. My question was to know if DF had plans in investigating the other side of the coin.

I'm not sure I understand this; the only side of the coin you should be measuring is the image quality.
> The first quote you brought literally says "if you don't zoom in".

I see the differences clearly without zooming in. Zooming in highlights them, making them impossible to miss. If or not.
IF
Not that they didn't.
> There is no arguing semantics here, it doesn't get clearer unless you're being intentionally obtuse.

Those who write such things are being intentionally obtuse. Which is exactly why I don't consider HUB a reliable source anymore.
1660p vs 4K Ultra FSR; 1660p will have a better frame time. You only now need to compare the image quality.
540p vs 1080p FSR Performance: once again, 540p will have better frame times, you now only need to compare the image quality.
You only need to compare the base resolution that FSR uses to upscale from vs FSR. The frame time will be impacted by FSR, the user will need to decide based on the image quality if it's worth it.
Vs a bilinear upscale, it appears to make sense. But there are other upsampling methods FSR will compete against and they may produce better image quality and have fairly identical performance metrics.
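For reference, the base resolutions in the comparisons above follow from FSR 1.0's per-axis scale factors, as published in AMD's FidelityFX Super Resolution documentation. A small sketch (the function name is mine; the factors are AMD's):

```python
# Per-axis scale factors for the FSR 1.0 quality modes (per AMD's documentation).
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def fsr_render_resolution(out_w, out_h, mode):
    """Internal resolution that FSR upscales from, for a given output resolution."""
    s = FSR_SCALE[mode]
    return round(out_w / s), round(out_h / s)

print(fsr_render_resolution(3840, 2160, "Ultra Quality"))  # (2954, 1662) - the "~1660p" above
print(fsr_render_resolution(1920, 1080, "Performance"))    # (960, 540) - the "540p" above
```

So "4K Ultra Quality FSR" really is a comparison against roughly 1662p native, and "1080p Performance" against 540p native, which is why those are the sensible baselines for the frame-time and image-quality trade-off.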
> I see the differences clearly without zooming in. Zooming in highlights them making them impossible to miss. If or not.

GamersNexus: "Ultra quality looks close enough to native 4K that the difference might not be immediately obvious"
FSR doesn't exist to downgrade quality. It exists to increase performance at the cost of quality.
...I'm not sure what is going on with the interpretation of my question.