AMD FSR antialiasing discussion

  • Thread starter Deleted member 90741
Looks like a good solution for XSX|PS5: if Steve from GN can run it on a Ryzen 5700G, it's guaranteed to run on the XSX at the very least.
 
FSR is very obviously a competitor to DLSS, and it aims to bring down the latter's popularity, mostly because DLSS depends on a specific hardware block that is only present in one IHV's products.

The Ultra Quality setting looks like a free >30% boost in general, with no perceptible difference in quality IMO. The Quality mode might still be a net positive for people with high-resolution monitors but hardware that can't keep up with them.
The other two modes don't seem like something I'd use at all, save for e.g. a portable console with a high-pixel-density screen like the OneXPlayer (or perhaps the rumored SteamPal).
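For reference, FSR 1.0's presets are fixed per-axis scale factors (these factors are from AMD's published documentation), so the internal render resolution for any output size is easy to work out:

```python
# FSR 1.0 per-axis scale factors, as published by AMD.
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def internal_resolution(out_w, out_h, mode):
    """Resolution the game actually renders at before FSR upscales to (out_w, out_h)."""
    scale = FSR_MODES[mode]
    return round(out_w / scale), round(out_h / scale)

# At a 4K output, every preset maps to a familiar base resolution:
for mode in FSR_MODES:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"{mode}: {w}x{h} ({100 / FSR_MODES[mode]**2:.0f}% of the pixels)")
```

At 4K this gives 2954x1662 for Ultra Quality (about 59% of the pixels), 2560x1440 for Quality, 2259x1271 for Balanced, and 1920x1080 for Performance.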

DLSS seems much better at upscaling low resolutions, especially the 1/4th-resolution Performance mode. OTOH, FSR never shows any ghosting / temporal artifacts. I'm guessing future iterations of FSR will focus on the lower modes, though it might need temporal data to do so.


What I'm surprised by is the performance uplift on pre-RPM GPUs. It seems to not be making use of FP16 at all, so it's working great on Polaris and Pascal cards.
 
Fun fact: AMD named the FSR quality presets in such a way that each of them has a) a similar name to the corresponding DLSS preset and b) a similar base resolution from which it upscales.
So I dunno about it not being a "DLSS competitor". People will definitely see it as such.
They really should not do that IMO, based upon what I have seen - as soon as a game has both DLSS and FSR in it, there will be an incredibly rude awakening in terms of what the differences are.
The Ultra Quality setting looks like a free >30% boost in general, with no perceptible difference in quality IMO.
The difference in edge quality on larger opaque surfaces is low between Ultra Quality and native 4K (for example), but inner edge quality (i.e. texture detail) or aliasing is the same as it would be at the real internal resolution (1662p).
One thing text reviews cannot show too easily, and even YouTube reviews cannot show too easily, is that FSR does not have the anti-aliasing stability of native 4K. Being single-frame based, any sub-pixel differences between this frame and the next change the aliasing - it is not very temporally stable, and of course it cannot be, based on how it works. In person, I do not think it looks very 4K-like - but it definitely looks better than a raw bilinear upscale of its real internal resolution. AKA 1662p bilinear-upscaled to 4K will look definitely worse than that same upscale done by FSR.
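For context on that baseline: a raw bilinear upscale is just per-pixel interpolation with no knowledge of edges or of neighboring frames. A minimal sketch on a grayscale float image (toy code, not any engine's actual resampler):

```python
def bilinear_upscale(src, out_w, out_h):
    """Naive single-frame bilinear upscale of a 2D grayscale image.

    Each output pixel blends the four nearest source pixels by distance.
    No temporal data is used, which is why purely spatial methods
    (bilinear, or FSR) cannot be temporally stable by construction.
    """
    in_h, in_w = len(src), len(src[0])
    out = [[0.0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        fy = y * (in_h - 1) / max(out_h - 1, 1)
        y0, ty = int(fy), fy - int(fy)
        y1 = min(y0 + 1, in_h - 1)
        for x in range(out_w):
            fx = x * (in_w - 1) / max(out_w - 1, 1)
            x0, tx = int(fx), fx - int(fx)
            x1 = min(x0 + 1, in_w - 1)
            top = src[y0][x0] * (1 - tx) + src[y0][x1] * tx
            bot = src[y1][x0] * (1 - tx) + src[y1][x1] * tx
            out[y][x] = top * (1 - ty) + bot * ty
    return out
```

FSR's EASU pass improves on this by detecting edge direction and adapting its filter kernel, but it sees exactly the same single low-resolution frame as input.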
 
They really should not do that IMO, based upon what I have seen - as soon as a game has both DLSS and FSR in it, there will be an incredibly rude awakening in terms of what the differences are.

Yeah, I was thinking that, they're not playing in the same field. Are you working on an FSR video? ;)
 
Yeah, I was thinking that, they're not playing in the same field. Are you working on an FSR video? ;)
Yeah - it is done. I just ended up having to edit the video twice because YouTube unfortunately ended up compressing away some of the aspects I talked about. YouTube is not the best platform to showcase aliasing or anything fine-detail, as the compressor might just remove it with a macroblock or blur!
Uploading it again as I write this.
 
OTOH, FSR never shows any ghosting / temporal artifacts.
You will still get these from the game's TAA, which isn't going anywhere with FSR, as FSR is not an anti-aliasing solution.
It remains to be seen which will end up being more visible - those TAA artifacts with FSR on top of them, or the typical temporal reconstruction artifacts in DLSS.

Of course, if a game doesn't use TAA at all, then there won't be any such artifacts with or without FSR.
But such a game would likely provide a very pristine native image, meaning that even UQ FSR will likely result in some very apparent blurring / aliasing / loss of detail.

It seems to not be making use of FP16 at all
Hardly surprising for a spatial shader based upscaler.
 
I would like to see normal bilinear from 77% + CAS sharpening vs FSR Ultra Quality mode to see how good the actual scaling algorithm is.
Yeah, this would be a good comparison. I'd also like to see FSR compared in a game that supports custom resolutions, to see what the performance is at the FSR internal resolution vs. the native+FSR version - that should give an indication of the actual perf cost of the algorithm itself.
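For anyone curious what "CAS sharpening" does in principle: it scales a sharpening kernel per pixel by how much headroom the local neighborhood has before clipping, so already-high-contrast areas get less sharpening. A toy grayscale Python sketch of that idea (simplified, not AMD's actual shader; the weight constants are illustrative):

```python
def cas_sharpen(img, sharpness=0.5):
    """Toy contrast-adaptive sharpen on a grayscale image with values in [0, 1].

    The per-pixel negative-lobe weight shrinks where the cross-shaped
    neighborhood is already near black or white, so existing high-contrast
    detail is not over-sharpened. Border pixels are left untouched.
    """
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = img[y][x]
            n = [img[y - 1][x], img[y + 1][x], img[y][x - 1], img[y][x + 1]]
            lo, hi = min(n + [c]), max(n + [c])
            amp = max(min(lo, 1.0 - hi), 0.0)      # headroom before clipping
            # Negative neighbor weight, stronger where there is headroom.
            wt = (amp ** 0.5) * (-0.125 - 0.075 * sharpness)
            val = (c + wt * sum(n)) / (1.0 + 4.0 * wt)
            out[y][x] = min(1.0, max(0.0, val))
    return out
```

A flat region passes through unchanged (the weights are normalized), and a maximal-contrast pixel gets zero extra sharpening, which is the "adaptive" part.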
 
The difference in edge quality on larger opaque surfaces is low between Ultra Quality and native 4K (for example), but inner edge quality (i.e. texture detail) or aliasing is the same as it would be at the real internal resolution (1662p).

But on aliasing, FSR is supposed to be compatible with any type of antialiasing, TAA included.

You're right that it probably can't do much on texture detail, though new-gen games seem to be pushing for more geometry instead of e.g. normal maps, so I wonder if that could play in FSR's favor somewhat.


The default is an FP32 version, but there's also an optimized variant that natively uses FP16.
Interesting. Then why are Polaris cards getting similar scaling to Vega and RDNA ones?
Perhaps GCN GPUs in general have an excess of compute that is never effectively used in games?
Or could FSR just be really light on the compute shaders?
 
This tool should have stopped at Quality mode and ditched the lower modes, as they aren't good. Ultra Quality should be called Quality, and Quality should be Balanced.

Other than that, I don't see anyone having complaints about the Ultra Quality mode. The FPS benefits for that mode are a no-brainer and only meticulous analysis will demonstrate the differences.

I wonder if any developers will implement this tech before the temporal reconstruction pass.
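On that placement question: AMD's FSR 1.0 integration guidance puts the upscale after anti-aliasing and tone mapping but before noisy post effects and the UI. A rough sketch of that ordering (the stage names here are illustrative placeholders, not engine API calls):

```python
# Rough frame order for a spatial upscaler, per AMD's FSR integration guidance.
# Stage names are illustrative placeholders, not actual engine calls.
FRAME_PIPELINE = [
    "geometry + shading",          # internal resolution
    "TAA / anti-aliasing",         # internal resolution
    "tone mapping",                # internal resolution
    "FSR upscale (EASU)",          # internal -> output resolution
    "FSR sharpen (RCAS)",          # output resolution
    "film grain / noise effects",  # output resolution, after sharpening
    "UI / HUD",                    # output resolution, never upscaled
]

def runs_before(a, b):
    """True if stage a runs earlier in the frame than stage b."""
    return FRAME_PIPELINE.index(a) < FRAME_PIPELINE.index(b)
```

Running FSR after TAA means it inherits whatever the TAA resolved, which is why the TAA-artifact discussion above matters; grain and UI come after so they are not sharpened or upscaled.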
 
I don't think FSR is a competitor to DLSS; the biggest appeal of FSR is that it works with any game and with many more GPUs than DLSS: GCN and RDNA 1/2 GPUs, plus Nvidia's 1000 series.

I hope many teams will use it, even on older games. This is a good solution, and a very welcome one given the GPU stock problem.

It will help people with older GPUs get a framerate boost in some games.
 
But on aliasing, FSR is supposed to be compatible with any type of antialiasing, TAA included.
What I mean there is the aliasing you see in the image even after the game's anti-aliasing has been applied (i.e. the places where it fails): the moiré patterns, the sub-pixel detail, the flicker and shimmer that one sees at 1662p but would not see at 4K. FSR will have the level of aliasing of its real internal resolution (1662p in Ultra Quality), and not that of the native presentation.
I am not sure if there is a different way I can express it to get that point across!
 
So I checked it out using The Riftbreaker demo... and, at least on Nvidia.. my results were ok I guess.

I cropped all of these from 3840x1600, since I have an ultrawide monitor.

Native
Native.jpg

Ultra Quality
Ultraquality.jpg

Quality
Quality.jpg

Balanced
Balanced.jpg

Performance
Performance.jpg


In motion, I noticed the reduction in quality immediately. Even Ultra Quality looked soft in motion, especially the foliage. It's definitely passable if you need some extra performance, however. Anything lower than Ultra Quality and the IQ is just not acceptable to me. This is obviously only one game, and I can only speak for my own setup, although it does appear to be the same on AMD GPUs as well.

The other thing I noticed is that in the Performance mode, you can see the FPS of 191 there, but that framerate was actually fluctuating between 170 and 195 constantly. It never remained still, despite the fact that absolutely nothing was changing in the game scene. I never noticed it in the other modes though.
 
The other thing I noticed is that in the Performance mode, you can see the FPS of 191 there, but that framerate was actually fluctuating between 170 and 195 constantly. It never remained still, despite the fact that absolutely nothing was changing in the game scene. I never noticed it in the other modes though.
You have most likely run into a CPU limitation.
 
Begun, the upscaling wars have.

I would like to see normal bilinear from 77% + CAS sharpening vs FSR Ultra Quality mode to see how good the actual scaling algorithm is.

Yeah, this should be the baseline upscale in these reviews. I've run 1440p and 1800p on a 4k TV with no sharpening and it wasn't half bad. Thin details like wires or ropes did suffer though.
 
Interesting. Then why are Polaris cards getting similar scaling to Vega and RDNA ones?
Perhaps GCN GPUs in general have an excess of compute that is never effectively used in games?
Or could FSR just be really light on the compute shaders?
It was designed not to be a heavy burden, taxing compute more than bandwidth.
And in GCN times, there was always the notion that AMD cards underperform relative to their paper spec of pushed TFLOPs.

So my guess would be a little bit of both: a good, lightweight design of the FSR shader and kernel, as well as unused compute.
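That "paper-spec TFLOPs" figure is just ALU count × clock × 2, since an FMA counts as two floating-point ops per clock. A quick sanity check (the clocks in the comments are approximate boost clocks from memory, so treat them as ballpark):

```python
def fp32_tflops(shader_alus, clock_ghz):
    """Theoretical FP32 throughput: each ALU retires one FMA (2 ops) per clock."""
    return shader_alus * clock_ghz * 2 / 1000.0

# Ballpark examples (approximate boost clocks):
#   RX 580 (Polaris): fp32_tflops(2304, 1.34) -> ~6.2 TFLOPs
#   Vega 64:          fp32_tflops(4096, 1.55) -> ~12.7 TFLOPs
```

Games rarely saturate that theoretical number on GCN, which is consistent with a lightweight compute shader like FSR finding spare ALU cycles even on Polaris.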
 
The other thing I noticed is that in the Performance mode, you can see the FPS of 191 there, but that framerate was actually fluctuating between 170 and 195 constantly. It never remained still, despite the fact that absolutely nothing was changing in the game scene. I never noticed it in the other modes though.

Could the fluctuating framerate be due to boost/throttling that keeps happening?
 