AMD FSR antialiasing discussion

That's not your average DOF. This DOF is a convolution kernel only a few pixels wide that works near the camera, so you have to move the camera to a certain distance or angle before it takes effect on the character at all. The rest of the image is untouched by the effect, as it should be, and stays identical with the slight DOF on or off.
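
To make "a convolution kernel only a few pixels wide" concrete, here is an illustrative sketch of that kind of slight near-field DOF, assuming a float image and a depth buffer; the focus threshold and kernel size are made up for the example, and this is not UE4's actual DiaphragmDOF code:

```python
import numpy as np
from scipy.ndimage import convolve

def slight_near_dof(img, depth, focus_near=1.0, kernel_size=5):
    """Illustrative only: blur with a small ~5px box kernel, but only where
    the depth buffer says a pixel is nearer than the focus plane; the rest
    of the frame is passed through untouched."""
    k = np.ones((kernel_size, kernel_size)) / kernel_size**2
    blurred = np.dstack([convolve(img[..., ch], k, mode="nearest")
                         for ch in range(img.shape[2])])
    near_mask = (depth < focus_near)[..., None]  # hypothetical focus threshold
    return np.where(near_mask, blurred, img)
```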

Are you trying to tell me this is a valid comparison?

[comparison screenshot]


That is native 4K on the left, 1080p TAAU upscaling on the right.

Can you honestly say that isn't a broken comparison, given that the DOF on the character has been removed?
 
FSR is a DLSS competitor; whether it's better or worse, we shall see. Even if DLSS 2.0 is better in some games, I'd wholeheartedly recommend watching the movie "Pirates of Silicon Valley". There is a scene where an overwhelmed Steve Jobs says to Bill Gates: "But... but... my OS is better!". To which Gates replies: "That... doesn't matter!".


Maybe it's just me, but I see more of a future for FSR than for DLSS.

Also agree with a comment on the video above:

I think you summed it up perfectly: "you were promised nothing and it's free". That's really always been the point here for gamers, I think. A free update to a ten-year-old game provides an enormous performance boost on ten-year-old hardware at higher resolutions. For free. There is just no valid way to be negative about that.
 
FSR is a DLSS competitor; whether it's better or worse, we shall see. Even if DLSS 2.0 is better in some games,

There's nothing really to wait and see about. DLSS is the superior solution, full stop. I expect that will be the case in every single game where we have the opportunity to put them head to head, not just some.

Maybe it's just me, but I see more of a future for FSR than for DLSS.

This would be a considerable shame, because it would mean the victory and proliferation of an inferior technology over a superior one. By far the better outcome would be for AMD to create their own ML-based super sampling solution which is genuinely competitive with DLSS and open source it. Or for Nvidia to open source DLSS. We ALL benefit from superior upscaling; no one gains from seeing the state-of-the-art solutions die out.

Don't get me wrong, I firmly believe that FSR has its place, and I would love to see it as an option alongside DLSS in every game where TAAU or some other superior upscaling tech isn't available. The bottom line is that upscaling should be open to everyone, in every game, and FSR helps enable that, hopefully in a big way, and that's a great thing. But the proliferation of DLSS (and comparable solutions) is still the ultimate goal.
 
@Cyan That HU video on FSR in DOTA 2 is not really selling it very well. It actually looks pretty much useless for low-end GPUs. FSR looks pretty terrible when upscaling to 1080p; it starts to become useful on 1440p monitors. It also looks like you have to be careful to make sure you're not actually losing performance on low-end GPUs. As for the last point about using FSR at 99%: that's just dumb. Essentially it's just going to over-sharpen the image. Better to use AMD CAS, which is probably much cheaper.
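
As an aside on CAS: the contrast-adaptive idea is simple enough to sketch. This is an illustrative single-channel Python version of the concept, not AMD's actual CAS/RCAS shader, and `amount` is a made-up 0-to-1 strength knob:

```python
import numpy as np

def cas_like_sharpen(img: np.ndarray, amount: float = 0.5) -> np.ndarray:
    """Illustrative contrast-adaptive sharpen for a 2D float image in [0, 1]."""
    p = np.pad(img, 1, mode="edge")
    n, s = p[:-2, 1:-1], p[2:, 1:-1]   # north / south neighbours
    w, e = p[1:-1, :-2], p[1:-1, 2:]   # west / east neighbours
    c = p[1:-1, 1:-1]                  # centre pixel
    mn = np.minimum.reduce([n, s, w, e, c])
    mx = np.maximum.reduce([n, s, w, e, c])
    # Adaptive weight: near 0 where the cross of samples already spans a wide
    # range (a strong edge), larger where the neighbourhood is flat.
    adapt = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / np.maximum(mx, 1e-5), 0.0, 1.0))
    lobe = -adapt * amount * 0.25      # negative lobe on the 4 neighbours
    # Normalised 5-tap filter: weights sum to 1, so uniform regions are unchanged.
    return np.clip((c + lobe * (n + s + w + e)) / (1.0 + 4.0 * lobe), 0.0, 1.0)
```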

I do think FSR has its place, but ironically it seems to be better for more capable PCs where you're driving a higher-resolution display like 1440p or 4K. Mid-range to low-end GPUs are just a bad match for those output resolutions (I suppose you could use a 4K monitor, output 1080p, and get a decent image).
 
@pjbliverpool In that one FSR interview that was linked, they basically said this was the first version of FSR and there would be other, more complex versions, maybe for higher-end GPUs. I think that pretty much confirms that even AMD knows spatial-only upscaling is a dead end. I still think FSR is good to have and can fit nicely into a lot of games to help people get more out of their GPUs. But I would not be shocked to see AMD open source something that's clearly more similar to TAAU, or even something built on DirectML.
 
This would be a considerable shame, because it would mean the victory and proliferation of an inferior technology over a superior one. By far the better outcome would be for AMD to create their own ML-based super sampling solution which is genuinely competitive with DLSS and open source it. Or for Nvidia to open source DLSS. We ALL benefit from superior upscaling; no one gains from seeing the state-of-the-art solutions die out.

I think @Cyan meant exactly that as the future of FSR. Future iterations of FSR using temporal data and/or machine learning might have a more promising future if they get closer to DLSS 2 in performance and image quality, because developers would be able to just port the code from multiplatform console titles and it would work on all hardware, not just Nvidia RTX cards.


@Cyan That HU video on FSR in DOTA 2 is not really selling it very well. It actually looks pretty much useless for low-end GPUs.

I guess it depends on what you interpret as a "low-end GPU". Is the GTX 1660 a low-end GPU?

[chart: DOTA 2 4K benchmark results]


With FSR, the 1660 reaches close to 140 FPS at 4K, a 37% boost in performance.
And if there were DLSS 2 for DOTA 2, the 1660 wouldn't be able to run it, so for this GPU, FSR in this game is always better than DLSS, which is useless on all GTX cards.
 
I think @Cyan meant exactly that as the future of FSR. Future iterations of FSR using temporal data and/or machine learning might have a more promising future if they get closer to DLSS 2 in performance and image quality, because developers would be able to just port the code from multiplatform console titles and it would work on all hardware, not just Nvidia RTX cards.




I guess it depends on what you interpret as a "low-end GPU". Is the GTX 1660 a low-end GPU?

[chart: DOTA 2 4K benchmark results]


With FSR, the 1660 reaches close to 140 FPS at 4K, a 37% boost in performance.
And if there were DLSS 2 for DOTA 2, the 1660 wouldn't be able to run it, so for this GPU, FSR in this game is always better than DLSS, which is useless on all GTX cards.

I don't disagree with any of that. The 1660 is probably right on the edge of FSR being really useful. If you just want your old games to run better, then great, it's a free upgrade. For newer games, you're getting to the point where 1080p low is right around that 60 fps mark, and you'll probably need reasonable performance at native 1080p low if you want any kind of reasonable image quality from FSR upscaling. When the real next-gen games start hitting, it's probably dead. So 4K Performance mode might not even cut it, since you'd be dropping below 60 fps. I don't know how good 1440p Performance or Balanced would look, but I guess if you're just trying not to spend money, you'll use it even if it doesn't look good. I'm sure a lot of people will just run performance modes at 1080p or 1440p if they have to. Edit: I ran Apex Legends at 720p low so I could get 120+ fps on my 1660 Super, and it looked VERY BAD. I definitely would have used FSR, even though it still would have looked awful.
 
Are you trying to tell me this is a valid comparison?
Are you trying to tell me that the aggressive sharpening in FSR doesn't screw up such comparisons in all the reviews?

Take a look at the whole image more carefully, then:
FSR UQ https://images.eurogamer.net/2021/articles/2021-06-22-14-23/AM_3_002.png/EG11/format/jpg/quality/95
Native https://images.eurogamer.net/2021/articles/2021-06-22-14-24/AM_4_002.png/EG11/format/jpg/quality/95

It screws up the results to the point where people make ridiculous claims in articles that simple upscaling from 59% of the pixels (i.e. with tons of detail lost) is better than native.
Unlike DF's comparison, this mismatch was introduced intentionally, and it affects not just one character's ass but all the tested games.
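
(Side note on the arithmetic: the 59% figure follows directly from FSR Ultra Quality's 1.3x per-axis scale factor.)

```python
# FSR 1.0 "Ultra Quality" renders at 1/1.3 of the output resolution per axis,
# so the fraction of native pixels actually rendered is (1/1.3)^2.
per_axis = 1 / 1.3              # ~0.77 of the output width/height
pixel_fraction = per_axis ** 2  # ~0.59, the "59% of pixels" figure above
print(f"{pixel_fraction:.0%}")  # -> 59%
```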

By your very own criteria, all the FSR comparisons in the reviews are skewed to begin with, because the sharpening is applied so aggressively that you can no longer compare FSR with native resolution, where no sharpening is applied.
You seem to be perfectly fine with such invalid comparisons, so yes, I am totally OK with a little bit of blur on the character's ass in DF's comparison (since this subtle blur doesn't change any of the conclusions about these techniques).
Try to be consistent with your own statements before accusing somebody of something.
 
Played a little bit with Terminator. Unlike in Kingshunt, RCAS can't be used outside of FSR, but FSR can be "activated" with native rendering:
4K vs. 4K FSR UQ with RCAS: https://imgsli.com/NTkxMDY
4K 77% vs. 4K FSR UQ without RCAS: https://imgsli.com/NTkxMDg

Tried TAAU too, and it is much better at reconstructing everything, but it isn't as stable as TAA and bilinear upscaling.

/edit: Did a quick comparison between 4K 50% TAAU and 4K FSR Performance without RCAS, so ignore the difference in sharpness: https://imgsli.com/NTkxNTE
 
There is a scene where an overwhelmed Steve Jobs says to Bill Gates: "But... but... my OS is better!". To which Gates replies: "That... doesn't matter!".

Even that can be debated. I do not think Apple's OS is by any means 'better' than Windows. Quite the opposite.

And get loads of GameWorks-like hate from fans who won't even try to understand why DLSS runs better on Nv h/w. Would be fun to watch, but I don't know if it would actually help with DLSS proliferation much.

True, it's hardware accelerated, which does speed things up. It's a combination of both.

I think @Cyan meant exactly that as the future of FSR. Future iterations of FSR using temporal data and/or machine learning might have a more promising future if they get closer to DLSS 2 in performance and image quality, because developers would be able to just port the code from multiplatform console titles and it would work on all hardware, not just Nvidia RTX cards.

Exactly, but not on RDNA2: those architectures won't be able to match GPUs that have a hardware block to accelerate this. The PS5 is out of luck on this one as well.

I guess it depends on what you interpret as a "low-end GPU". Is the GTX 1660 a low-end GPU?

Can't call it high end, at least. The GTX 1060 launched in mid-2016 as Pascal's lowest offering at the time. It's still not a bad GPU if you don't want the highest settings, etc.
 
There's nothing really to wait and see about. DLSS is the superior solution, full stop. I expect that will be the case in every single game where we have the opportunity to put them head to head, not just some.



This would be a considerable shame, because it would mean the victory and proliferation of an inferior technology over a superior one. By far the better outcome would be for AMD to create their own ML-based super sampling solution which is genuinely competitive with DLSS and open source it. Or for Nvidia to open source DLSS. We ALL benefit from superior upscaling; no one gains from seeing the state-of-the-art solutions die out.

Don't get me wrong, I firmly believe that FSR has its place, and I would love to see it as an option alongside DLSS in every game where TAAU or some other superior upscaling tech isn't available. The bottom line is that upscaling should be open to everyone, in every game, and FSR helps enable that, hopefully in a big way, and that's a great thing. But the proliferation of DLSS (and comparable solutions) is still the ultimate goal.
You are right, the technology is superior. You can say the same about G-Sync Ultimate; it's certainly better than even FreeSync Premium Pro, which my monitor supports.

G-Sync Ultimate uses a dedicated chip and you won't experience stuttering when LFC kicks in; it's just better overall.

The issue here is how Nvidia's policies get in the way.

It has been demonstrated that DLSS 1.0 could be used on any Nvidia card, yet it was limited to the RTX 20xx series; it didn't need the tensor cores at all.

They did the same with integer scaling; my GTX 1080 and GTX 1060 3GB can't use it. It would do wonders on my native 1440p screen when I want to play certain games at 1080p on the GTX 1060 3GB.

Without integer scaling, direct 1080p-to-1440p upscaling is really, really bad; it makes your eyes bleed. I prefer to play at 1440p even if it means lower settings (which is a hassle).
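
For what it's worth, here is a minimal sketch of what integer scaling actually does, and why it only applies cleanly at whole-number factors:

```python
import numpy as np

def integer_scale(img: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour upscale by a whole-number factor: every source
    pixel becomes a crisp factor x factor block, with no blending."""
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

# 720p -> 1440p is a clean 2x integer scale. 1080p -> 1440p is 1.33x, which is
# not an integer factor, so a scaler has to blend neighbouring pixels instead;
# that blending is the softness being complained about here.
frame = np.zeros((720, 1280, 3), dtype=np.uint8)
upscaled = integer_scale(frame, 2)   # shape (1440, 2560, 3)
```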

However, they decided that it is a feature only for newer GPUs.

Imho, making such a simple feature exclusive makes no sense. In the end I am going to buy a new GPU, but not because of integer scaling; simply because, as efficient as they are, my current GPUs are showing their age.

Since I have no love for any team's colours, be it Nvidia, AMD or Intel, I am just waiting to decide and buy the most efficient GPU for my tastes (1440p gaming at 165-240 Hz when possible; if not, 60 fps is okay).

AMD is my favourite right now, since their GPUs are the most efficient, but I can wait a few months and see what Intel has to offer.
 
By your very own criteria, all the FSR comparisons in the reviews are skewed to begin with, because the sharpening is applied so aggressively that you can no longer compare FSR with native resolution, where no sharpening is applied.
You seem to be perfectly fine with such invalid comparisons, so yes, I am totally OK with a little bit of blur on the character's ass in DF's comparison (since this subtle blur doesn't change any of the conclusions about these techniques).
Try to be consistent with your own statements before accusing somebody of something.

So, just to be 100% clear:

You are OK with the image quality from a bugged effect (DOF being removed when using TAAU), but you are NOT OK with 100% developer-added sharpening from FSR?

DOF being removed is a bug. FSR sharpening the image is intended and set by the developer.

Personally, I'm fine with developer-intended effects, and not bugs. I guess you like bugs over features.
 
So, just to be 100% clear:

You are OK with the image quality from a bugged effect (DOF being removed when using TAAU), but you are NOT OK with 100% developer-added sharpening from FSR?

DOF being removed is a bug. FSR sharpening the image is intended and set by the developer.

Personally, I'm fine with developer-intended effects, and not bugs. I guess you like bugs over features.

No, RCAS is not optional. It is part of the package and always forced on. Otherwise, why would a developer oversharpen an undersampled image and not the native one?!
 
No, RCAS is not optional. It is part of the package and always forced on. Otherwise, why would a developer oversharpen an undersampled image and not the native one?!

They can set the sharpening amount. It is an intended feature.

Again, you're ignoring the big picture, which is that the DOF being removed with TAAU is a bug.
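
On the "set the sharpening amount" point: in AMD's public FSR 1.0 source (ffx_fsr1.h), the RCAS setup function FsrRcasCon takes exactly such a knob, a sharpness value expressed in "stops", where 0.0 is maximum sharpening and each additional stop halves the effect. A rough Python restatement of that scale (illustrative, not AMD's code):

```python
# Sharpness in "stops", as used by FsrRcasCon in ffx_fsr1.h:
# 0.0 = maximum sharpening; each +1 stop halves the strength.
def rcas_sharpness_scale(stops: float) -> float:
    return 2.0 ** -stops   # 0 -> 1.0 (max), 1 -> 0.5, 2 -> 0.25
```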
 
Like disabling DoF with TAAU? Maybe it is the intention of the engine provider to disable it, just like the FSR provider intends to oversharpen undersampled upscaled images.

No, it's a bug, and it's documented by the TAAU developer:

Out of curiosity, will the new TAA upscaling behave well with depth of field? Currently, when you set r.TemporalAA.Upsampling=1, most of the DOF just disappears.

So when r.TemporalAA.Upsampling=1, it basically forces r.DOF.Recombine.Quality=0, which loses the slight DOF convolution, and that is due to DiaphragmDOF.cpp's bSupportsSlightOutOfFocus. There need to be some changes in the handling of the slight out-of-focus convolution (about 5 pixels and below) when doing temporal upsampling that I didn't have time to get down to. And we were only using temporal upsampling on current-gen consoles. It wasn't a big deal back then, because if your frame needed to be temporally upsampled, that probably meant you didn't have the performance to run DOF's slight out of focus… However, we ran into exactly this issue for our Lumen in the Land of Nanite demo running on PS5, but it is still a prototype and I'm not sure whether I'm going to have this finished by 4.26's release. But given how important temporal upsampling is going to become, it's definitely a fix very high on the priority list.

https://forums.unrealengine.com/t/gen-5-temporal-anti-aliasing/152107/5

It's a bug that wasn't fixed because they figured that if you are upscaling, you want more performance, and DOF costs performance, so it breaking wasn't a big deal as you'd likely want to disable it anyway.
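
For anyone who wants to reproduce this, the console variables quoted above can be set in a UE4 project's DefaultEngine.ini (standard UE convention; the 50% value just mirrors the "4K 50% TAAU" test earlier in the thread):

```ini
[SystemSettings]
; Enable TAAU (upscaling happens inside the TAA pass):
r.TemporalAA.Upsampling=1
; Render at 50% resolution per axis (the "4K 50%" case):
r.ScreenPercentage=50
; Per the quoted post, TAAU currently forces the equivalent of
; r.DOF.Recombine.Quality=0, dropping the slight (few-pixel) DOF convolution.
```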

But comparing broken DOF to enabled DOF and saying the DOF image looks worse is disingenuous at best.
 