AMD FSR antialiasing discussion

  • Thread starter Deleted member 90741
There is no "own post-processing solution" in FSR for AA. FSR must be applied to an already-antialiased image, which means it uses whatever the game has for AA.
There is no specific PP for an upscaler in order for it to 'upscale'. You understood the context in which I wrote that. Under no circumstance does it mean some sort of explicit use of this mysterious AMD-specific PP :LOL:. It's whatever the developer decides to use for it that brings the IQ in line as intended, as exemplified by Necromunda: Hired Gun.
 
There is no specific PP for an upscaler in order for it to 'upscale'. You understood the context in which I wrote that. Under no circumstance does it mean some sort of AMD-specific PP. It's whatever the developer decides to use for it that brings the IQ in line as intended, as exemplified by Necromunda: Hired Gun.
Right. And most developers use TAA of varying quality and characteristics. Thus in most games FSR will rely on said TAA to antialias the image before upscaling. So what is your point? That we can't assess the quality of FSR in games with "bad" TAA implementations?

Necromunda Hired Gun: FSR enabled which disables TAA.
It doesn't disable TAA, it disables the user option to disable it, because you have to use AA for FSR to work correctly. TAA is always active with FSR.
DLSS does disable the game's own TAA though, because DLSS is also a TAA in addition to an upscaler.
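The ordering being described can be sketched as a small, purely illustrative pipeline model: FSR 1.0 is a spatial pass that expects an already-antialiased frame, so the game's TAA stays in the chain, while DLSS folds temporal AA into its own upscaling pass. All pass names below are invented for illustration, not taken from any real engine.

```python
# Illustrative sketch only: pass names are hypothetical, not from any real engine.
def build_pipeline(upscaler):
    """Return the ordered post-process passes for a given upscaler choice."""
    passes = ["lighting"]
    if upscaler == "FSR":
        # FSR 1.0 is spatial-only: it upscales whatever the game's TAA produced.
        passes += ["game_taa", "fsr_easu_upscale", "fsr_rcas_sharpen"]
    elif upscaler == "DLSS":
        # DLSS accumulates samples temporally itself, so the game's TAA is off.
        passes += ["dlss_taa_and_upscale"]
    else:
        passes += ["game_taa"]
    # UI and tonemapping are composited after upscaling, at output resolution.
    passes += ["tonemap", "ui"]
    return passes
```

The key point the posts above make falls out of this ordering: with FSR the game's TAA is always in the chain, with DLSS it is replaced.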

I can only hope that we start to see a trend of comparing FSR/DLSS with their own implementation of PP which overrides in-game AA, as exemplified by Necromunda: Hired Gun.
There is no "FSR own implementation" of AA. FSR doesn't have any AA in it.

These results are using a 3080 Ti and 3090, not Radeon cards. So, is the grill mesh not rendering properly due to a driver issue? That is still unknown right now.
It is a geometry LOD issue, with object LODs being tied to the game's rendering resolution instead of to the output resolution, which is the correct way to do it if you're using resolution reconstruction like DLSS or TAAU. Radeon cards will show the same LOD if the game renders at the same resolution - i.e. DLSS Quality vs FSR Quality instead of Q vs UQ.
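A minimal sketch of the LOD-selection point being made here, with invented thresholds and coverage numbers: if the LOD is picked from the internal render resolution, a reconstruction-based upscaler (DLSS/TAAU-style) gets a coarser mesh than the displayed resolution warrants. The function and all values are hypothetical.

```python
# Hypothetical LOD selector: larger on-screen objects get more detailed LODs.
def select_lod(object_screen_height_px, lod_thresholds=(256, 128, 64)):
    """Return LOD index (0 = most detailed) from the object's on-screen size."""
    for lod, threshold in enumerate(lod_thresholds):
        if object_screen_height_px >= threshold:
            return lod
    return len(lod_thresholds)

render_h, output_h = 1440, 2160  # e.g. DLSS Quality reconstructing to 4K
coverage = 0.06                  # object spans 6% of screen height

# Wrong: sizing from the internal render resolution picks a coarser LOD
# (the "missing grill mesh" symptom described above).
lod_wrong = select_lod(coverage * render_h)   # 86.4 px -> LOD 2
# Right: sizing from the output resolution keeps the detail the display needs.
lod_right = select_lod(coverage * output_h)   # 129.6 px -> LOD 1
```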
 
Right. And most developers use TAA of varying quality and characteristics. Thus in most games FSR will rely on said TAA to antialias the image before upscaling. So what is your point? That we can't assess the quality of FSR in games with "bad" TAA implementations?
This is a logical fallacy known as Begging the Question or Circular Argument. The answer was already addressed in my prior response.

It doesn't disable TAA, it disables the user option to disable it, because you have to use AA for FSR to work correctly. TAA is always active with FSR.
DLSS does disable the game's own TAA though, because DLSS is also a TAA in addition to an upscaler.
Really now? So that explains perfectly why TAA isn't grayed out in Avengers. Because clicking the AA option in that game would "mess up" FSR.
:LOL:


There is no "FSR own implementation" of AA. FSR doesn't have any AA in it.
Another logical fallacy that doesn't address my response: "I can only hope that we start to see a trend of comparing FSR/DLSS with their own implementation of PP which overrides in-game AA, as exemplified by Necromunda: Hired Gun."
Looking for a particular trend in how IQ is compared between upscalers is not an indication of AMD having a mysterious PP for FSR. It's implied that higher-quality PP be used for FSR :LOL:



It is a geometry LOD issue, with object LODs being tied to the game's rendering resolution instead of to the output resolution, which is the correct way to do it if you're using resolution reconstruction like DLSS or TAAU. Radeon cards will show the same LOD if the game renders at the same resolution - i.e. DLSS Quality vs FSR Quality instead of Q vs UQ.
That's a long-winded way of agreeing that it more than likely is an Nvidia driver issue rather than FSR. I don't see this on AMD GPUs.
 
Really now? So that explains perfectly why TAA isn't grayed out in Avengers. Because clicking the AA option in that game would "mess up" FSR.
It is up to the developers to control their game's options when integrating FSR (or DLSS). I'm just relaying what AMD's FSR guide says.

Another logical fallacy that doesn't address my response: "I can only hope that we start to see a trend of comparing FSR/DLSS with their own implementation of PP which overrides in-game AA, as exemplified by Necromunda: Hired Gun."
Looking for a particular trend in how IQ is compared between upscalers is not an indication of AMD having a mysterious PP for FSR. It's implied that higher-quality PP be used for FSR :LOL:
So the point is, as I've said, not to assess the quality of FSR in games with "crap" AA. The problem is that nobody will use anything but what is already in the game, so FSR will have to work with that, and we will check the results regardless. If this pushes some devs to implement better AA by default then good for us, but I somewhat doubt it, since that would defeat the key advantage of FSR in the first place: ease of integration.

That's a long-winded way of agreeing that it more than likely is an Nvidia driver issue rather than FSR. I don't see this on AMD GPUs.
Game's choice of geometry LODs is not an Nvidia driver issue.
 
Although I find your question disingenuous, as I made it quite clear that the TAA in Avengers was garbage, it did make me ponder it further.
  • These results are using a 3080 Ti and 3090, not Radeon cards. So, is the grill mesh not rendering properly due to a driver issue? That is still unknown right now.
Which API does it use? At least around launch, RTX 30 had issues with FP16-precision FSR in D3D11 games, which according to AMD was an NVIDIA driver issue.
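As a general side note on half precision (separate from the driver bug itself): FP16 has only a 10-bit mantissa, so arithmetic that is exact in FP32 can silently drop sub-pixel offsets at larger magnitudes, which is why a mishandled FP16 path shows up as visible image errors. A small NumPy illustration, assuming NumPy is available:

```python
import numpy as np

# At magnitude 1024 the spacing between representable FP16 values is 1.0,
# so a quarter-texel offset vanishes; FP32 keeps it exactly.
sum32 = np.float32(1024.0) + np.float32(0.25)   # stays 1024.25
sum16 = np.float16(1024.0) + np.float16(0.25)   # rounds back to 1024.0
```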
 
I just looked at all the games that I've played in the past year and a total of 1 of them supports DLSS (Cyberpunk 2077). So, as of the moment, DLSS is almost completely useless to me.

We'll see how much better FSR fares in getting integrated into games I actually play. Although if the quality doesn't improve I'm not sure how many titles I would use it in anyways.

What we need is something that is as easy to implement into games and can work in all games like FSR with the quality of DLSS (quality setting or higher), IMO. Not sure that's ever going to happen however.

Regards,
SB
 
Well, if the current Lovelace/RDNA3 leaks are true, maybe these technologies will be less salient with 2.5x-3x the performance of current-gen top offerings.
 
Someone produced a video "emulation" of Native vs. FSR Quality on Steam Deck's 7" 1280*800 screen.
I guess it's mostly valid for people with high-density screens, like e.g. a 27" 4K panel, with the video in fullscreen.



Considering FSR can be injected into Proton, this is applicable to virtually every game that SteamOS 3 supports.
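For reference, the injection mentioned above is exposed in GloriousEggroll's Proton-GE builds through environment variables; a typical Steam launch-options line looks like the sketch below (variable names from Proton-GE; treat the exact strength value as an example).

```shell
# Steam launch options for a Proton-GE game: enable the fullscreen FSR hack,
# then pick a below-native resolution in-game and it is upscaled to native.
# WINE_FULLSCREEN_FSR_STRENGTH controls sharpening (lower = sharper).
WINE_FULLSCREEN_FSR=1 WINE_FULLSCREEN_FSR_STRENGTH=2 %command%
```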
 
What we need is something that is as easy to implement into games and can work in all games like FSR with the quality of DLSS (quality setting or higher), IMO. Not sure that's ever going to happen however.
OGSSAA - just super sample it. Easy.

Something with quality that also offsets performance like DLSS? I do not think that is very feasible for an "easy integration" any time soon.
 
Something with quality that also offsets performance like DLSS? I do not think that is very feasible for an "easy integration" any time soon.

At least on UE5 games, TSR should be easy to integrate.
And like FSR, TSR should work on all hardware and isn't exclusive to nvidia RTX cards. It should work on nvidia MX and GTX cards.
 
OGSSAA - just super sample it. Easy.

Something with quality that also offsets performance like DLSS? I do not think that is very feasible for an "easy integration" any time soon.

I'm not terribly fond of OGSSAA, I much prefer the increased IQ from RGSSAA. :) Regardless, in either case, the performance impact is generally too large. Although that said, when I was still rocking my V5 5500, I had RGSSAA on all the time. And it's what kept me from buying MUCH faster NV GPUs for quite a long time as well.

The problem I currently have with DLSS is that it's difficult enough to integrate into games that I doubt the majority of games I play will ever get it, thus it's interesting but almost completely useless for me.

Compounding that, of course, is that it's proprietary to one vendor. If it was universally available to all vendors I'm not sure that would solve the issue of integration into games, but it might help.

As such for me, at least, both FSR and DLSS are not selling points for GPUs as both aren't something I'd likely use. For FSR it's due to questionable quality in current FSR implementations, for the other it's just the fact that it's almost nowhere to be seen in the games that I play.

Regards,
SB
 
Do you really need FSR for a 1280*800 native res? Native / perfect pixel match on a portable device looks nice...
 
Do you really need FSR for a 1280*800 native res? Native / perfect pixel match on a portable device looks nice...
In the case of Steam Deck, anything that can help those 8CUs at 1-1.6GHz reach great visuals is a welcome addition.
 
In the case of Steam Deck, anything that can help those 8CUs at 1-1.6GHz reach great visuals is a welcome addition.

If I am not wrong, the problem with FSR is that it needs a high input pixel count to get OK visuals at a somewhat higher output pixel count.
And 1280x800 is not a high enough pixel count for FSR to have an OK effect on visuals, let alone great visuals.
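To put numbers on this, here is a small sketch using AMD's published FSR 1.0 per-axis scale factors (1.3/1.5/1.7/2.0); the helper function itself is hypothetical. At a 1280x800 output, even Quality mode renders at roughly 853x533 internally, which is very little source data for a spatial upscaler to work with.

```python
# AMD's documented FSR 1.0 per-axis scale factors; the helper is hypothetical.
FSR_MODES = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

def fsr_input_resolution(out_w, out_h, mode):
    """Render (input) resolution that FSR upscales from, for a given output."""
    s = FSR_MODES[mode]
    return round(out_w / s), round(out_h / s)

quality_in = fsr_input_resolution(1280, 800, "Quality")          # (853, 533)
performance_in = fsr_input_resolution(1280, 800, "Performance")  # (640, 400)
```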
 
If I am not wrong, the problem with FSR is that it needs a high input pixel count to get OK visuals at a somewhat higher output pixel count.
And 1280x800 is not a high enough pixel count for FSR to have an OK effect on visuals, let alone great visuals.

Whatever we determine as "OK visuals" might be a little bit negatively influenced by a bunch of reviewers showing only 400% zoom screenshots without mentioning temporal quality/artifacts (which you cannot see in a screenshot).

If you have a monitor with high-ish pixel density, you can compare/emulate the actual difference in motion between native and FSR Quality here:
https://forum.beyond3d.com/posts/2216901/


In the end, whatever one determines as "OK visuals" is going to be a subjective term. Perhaps one person will prefer to run their game at native 1280*800 and set everything on low to get 60 FPS, whereas another will gladly trade off some blurriness, lower texture detail and 30FPS in exchange for raytraced reflections, global illumination and SSAO.
 
So... something amazing has been achieved: FSR can now be universally applied to any game out there! This has been done using Wine/Proton (the compatibility layer for running Windows software on Linux). It can also mean that playing on Linux is faster than on Windows, even under the same testing conditions.

Glorious Eggroll, an engineer working for Red Hat, explains how it works in this video (16:00 minutes in, which I set in the link below, though the whole video is well worth the watch).


Warframe can be played at 250fps native 4k using this tech. Rocket League runs at more than 350fps 4k.

The Witcher 3 looks awesome, etc etc.
 