AMD FSR antialiasing discussion

  • Thread starter Deleted member 90741
Godfall RX 6900 XT vs RTX 3090 FSR Ray Tracing Performance - The FPS Review
August 24, 2021
The intent of this article was to focus on the performance aspects of FSR, and to see what kind of potential performance advantage it could bring. That said, we will leave you with our opinion, based on the performance experienced here and our experience playing Godfall: only use Ultra Quality. Anything below that does not look good. At Ultra Quality you get the big performance boost if you need it, and you sacrifice a little image quality in the process because it is not the original native resolution. If you know that going in, and you need the performance boost, then FSR will provide that for you.
 
You need to try TempleOS; it's like Windows but with more Jesus.

PS: How does Lossless Scaling work with unusual aspect ratios (like 32:10)?
PPS: Does it run outside of Steam (if I want to use it on a non-Steam game)?

I can get it to work with Eyefinity (53:9), so it should work fine for 32:10.

The main problem is getting a windowed resolution close enough to scale from, because keeping the scaling at 1.0 in LS with an in-game resolution scale below 100 gives me worse performance in Cyberpunk (static FidelityFX resolution scale). Using a tool like Borderless Gaming to force a specific window resolution and then upscaling from that with Lossless Scaling gives extra performance.

6400x1080 vs. 5334x900x1.2 comparison

 
While playing RE2R I got my first ever FSR Quality screenshots on Linux (Pop!_OS) via Wine and Proton-GE, which means a base resolution of 1706x960 upscaled to native 1440p (my monitor's resolution).

I was mistaken before: the pictures I posted did have FSR enabled, but I had set the game's resolution to 1440p, which was already native (so there was no difference from actual 1440p).

These are the best pictures I got comparing native 1440p vs RE2R FSR Quality at 1440p (the skin is drier in some shots, but not because of resolution; I just spent more time in the police station when playing with FSR on, and the skin dries out eventually):

NATIVE 1440p
rkU5XTr.png


FSR QUALITY (1706x960 INTERNAL TO 1440p)
KrT2RZB.jpg


NATIVE 1440p
5QDGMPj.png


FSR QUALITY (1706x960 INTERNAL TO 1440p)
JVoEuSq.jpg


NATIVE 1440p
aYvQyBh.png


FSR QUALITY (1706x960 INTERNAL TO 1440p)
ZYFOLH7.jpg


The slight loss of detail is well worth it! It's hard to notice, and the game still looks gorgeous. By comparison, the framerate is crazy high, or the power consumption drops way down at a fixed 60fps.
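For reference, the 1706x960 figure falls straight out of AMD's published FSR 1.0 scale factors (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x). A quick sketch computing them with plain shell integer math; real implementations may round a pixel or two differently:

```shell
#!/bin/sh
# Internal render resolutions for AMD's published FSR 1.0 quality modes.
# Factors are passed scaled by 10 so plain POSIX integer arithmetic works.
fsr_internal() {
    w=$1; h=$2; f10=$3          # f10 = scale factor x 10 (13, 15, 17, 20)
    echo "$(( w * 10 / f10 ))x$(( h * 10 / f10 ))"
}

# Print all four modes for a 2560x1440 display; Quality gives 1706x960,
# matching the resolution used in the screenshots above.
for mode in "Ultra Quality:13" "Quality:15" "Balanced:17" "Performance:20"; do
    printf '%-13s %s\n' "${mode%:*}" "$(fsr_internal 2560 1440 "${mode#*:}")"
done
```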
 
Especially interesting, as he doesn't adjust the game's resolution after the beginning.
I guess Lossless Scaling scales it a bit and then Windows (or something) takes it the rest of the way.

The scaling factor works in one direction.
Pretty sure in his case he should use auto and just change the rendering resolution in-game.
I've never tested auto, though, so I'm not sure it works.

A manually set scaling factor should be set according to the render resolution.
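If you do set the factor manually rather than using auto, it is just the display size divided by the window's render size; a quick sketch, assuming Lossless Scaling takes a single uniform factor:

```shell
#!/bin/sh
# Uniform scale factor needed to go from a window/render resolution to the
# display resolution (one axis is enough if the aspect ratio is preserved).
scale_factor() {
    awk -v out="$1" -v src="$2" 'BEGIN { printf "%.2f\n", out / src }'
}

scale_factor 1440 960    # 1440p display from a 960-line window -> 1.50
```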
I purchased the app and set it to auto; it seems to work fine. However, having to play windowed, which kinda breaks FreeSync, can be a pain (though on Windows, Nvidia's "G-Sync" for FreeSync monitors has an option to work in windowed mode, although I haven't tested it extensively).

In that sense, the Linux solution of using Wine and Proton-GE might have one advantage: it lets you play full screen.

eHqdUHa.png


FreeSync on Linux, at least on my Nvidia card, gives me some issues with random blackouts, but well... that's another story, and I haven't tested more than one game.
 
Some screengrabs using Proton-GE for Linux.

Note that the IQ doesn't seem to change much in Age of Wonders 3, but the font size and UI size and detail are very different at 1706x960 compared to native 1440p.

STREET FIGHTER IV FSR QUALITY (1706x960 INTERNAL TO 1440p)

YrLh8lS.png


No native 1440p picture to compare to this time.

AGE OF WONDERS 3 FSR QUALITY (1706x960 INTERNAL TO 1440p)

OSV1Gx5.jpg


AGE OF WONDERS 3 NATIVE 1440p

MU5qCbz.jpg
 
Note that the IQ doesn't seem to change much in Age of Wonders 3, but the font size and UI size and detail are very different at 1706x960 compared to native 1440p.
Which is one of the main problems with a hacked-in global solution: you're upscaling the final frame, including the UI, instead of operating only on the rendered frame before any post-processing. I guess it's still good as an option, though, if you really need a performance boost.
 
Which is one of the main problems with a hacked-in global solution: you're upscaling the final frame, including the UI, instead of operating only on the rendered frame before any post-processing. I guess it's still good as an option, though, if you really need a performance boost.

NOTE - this is not to say that this is a bad way to look at things. For academic purposes it's certainly interesting, and a way to see how well an upscaling algorithm holds up against native rendering at the higher target resolution.

But I think people sometimes miss the point of things like this. For real-world use, we should be looking at it this way, IMO...
  • A person's machine can run game X at resolution Y with settings Z. For the sake of this illustration, let's say the resolution is 1080p.
  • Let's say the person's display has a native resolution of 1440p.
  • Does the game, when run full screen, look better natively rendered at 1080p or upscaled with FSR to 1440p?
    • Basically, does FSR look better than GPU or display hardware upscaling?
  • Alternatively, consider lowering some settings in order to render natively at the target framerate at 1440p (if possible).
    • Does the game look better natively rendered at 1440p with lower settings, or with higher settings at 1080p upscaled to 1440p with FSR?
In this example, if there is no natively supported upscaling algorithm, a hacked-in version upscaled to 1440p might still look better than natively rendering at 1080p and letting the GPU or display hardware do the upscaling. Obviously, better results would come from the game implementing the upscaling itself.

Perhaps some review site does this already, but I think it would be illustrative for sites to benchmark upscaling techniques with a locked target framerate while adjusting IQ settings as resolution increases.

That would be more representative of how something like this is used, IMO.

Of course, the problem, as always, is that how good something looks is subjective. One person might prefer lower IQ settings at native render over an artifacted final upscale (that would be me; the artifacts may or may not be immediately noticeable, depending on the person). OTOH, someone may REALLY not want to turn down a particular IQ setting, so any upscaling artifacts are the lesser evil.

Regards,
SB
 
Which is one of the main problems with a hacked-in global solution: you're upscaling the final frame, including the UI, instead of operating only on the rendered frame before any post-processing. I guess it's still good as an option, though, if you really need a performance boost.
This could actually be really useful for games that lack native UI scaling.
I had to give up on playing Crusader Kings 2 on a 1440p display because the UI was too small, and scaling from a lower resolution got too blurry to read comfortably.
 
  • Does the game, when run full screen, look better natively rendered at 1080p or upscaled with FSR to 1440p?
    • Basically, does FSR look better than GPU or display hardware upscaling?
The issue with GPU or display upscaling from 1080p to 1440p is that the IQ is so horrible that I'd rather set the resolution to 1440p and play with low settings than at 1080p with the fanciest settings.

Other advantages, which are important for me, are:

- less power consumption (the electricity bill has gone through the roof lately where I live)
- of course, a lot more performance, so FSR Quality instead of Ultra Quality might be enough for me in most cases.

The differences are not that pronounced. Of course, a Digital Foundry video might be eye-opening, but you don't normally zoom the image to 200%, and the upscaling is free of the mediocre GPU upscaling artefacts. That's huge for people like me.

Which is one of the main problems with a hacked-in global solution: you're upscaling the final frame, including the UI, instead of operating only on the rendered frame before any post-processing. I guess it's still good as an option, though, if you really need a performance boost.
@alpe might have a point there.

The viewing distance I play at is about 70-85cm from my display, which isn't much for a 32" monitor, and when playing Age of Wonders 3 at native 1440p the fonts and icons look very, very small, to the point that I prefer to play at an internal 1706x960.

Ideally, yes, FSR should be applied to the rendered frame before the UI, but in this case I am not going to complain. An unsung hero.
 
This is the first game I've ever played on Linux first (all the games I'd played on Linux until this one were games I already had on Windows): Super Sports Blast.

The game is perfect to play with friends, family, kids, etc, and includes 3 games (Super Soccer Blast, Super Volleyball Blast and Super Tennis Blast) which you can buy separately or in this pack.


It might be the cartoony style, but well... judge the IQ differences for yourself.

Native or upscaled via FSR, the game can be played at 120fps 1440p and the computer won't even break a sweat!

SUPER SPORTS BLAST FSR QUALITY (1706x960 INTERNAL TO 1440p)

kIMPEvq.png


SUPER SPORTS BLAST NATIVE 1440p

IoDb6ji.png



SUPER SPORTS BLAST FSR QUALITY (1706x960 INTERNAL TO 1440p)

8u6bKGG.png


SUPER SPORTS BLAST NATIVE 1440p

iXWYRKF.png



SUPER SPORTS BLAST FSR QUALITY (1706x960 INTERNAL TO 1440p)

NMh97N9.png


SUPER SPORTS BLAST NATIVE 1440p

dvbN509.png



SUPER SPORTS BLAST FSR QUALITY (1706x960 INTERNAL TO 1440p)

wZLK7Q8.png


SUPER SPORTS BLAST NATIVE 1440p

9cD8Yk7.png
 
FSR interview with Nick Thibieroz, Director of Game Engineering at AMD
September 4, 2021

According to the interview, Tim Lottes (currently a developer at Unity) provided the upscaling algorithm that FSR currently uses. He also worked on FXAA development for Nvidia prior to moving to Unity.

That's true. And his plans for FXAA 4 a decade ago were interesting. You have to read his blog via the Wayback Machine, but it's still readable. Hopefully the link works OK.

https://web.archive.org/web/2012010....com/2011/12/fxaa-40-stills-and-features.html
 
Retroarch adds FSR support.

https://www.libretro.com/index.php/retroarch-1-9-9-released/

The developers released RetroArch 1.9.9, which, among other things, adds FSR support. While they mention that FSR is meant to be used in a compute shader, which they don't have in RetroArch, they "used it in a fragment pass anyway and it Just Works!". As always, people working on emulators come up with pretty clever solutions to improve how old games look. They also included some quick examples, like the ones below:

Ys_Seven_undub-fsr.png


Below: Ys Seven undub with the PPSSPP core in RetroArch. The image is zoomed in by 2x. Left: no shaders; middle: AMD FSR shader; right: AMD FSR + SMAA.

Gradius_V_Japan-210902-125220.png



 
many interesting points, like:

The tech we ended up with did not require any specific hardware, and I wanted at the base level to be very accessible.

How do you provide something that looks great, is fast, works on every hardware, or as many possible platforms as possible?

This is very important. A colleague captured these two screenshots while playing RDR2 on a laptop with an i7-7700HQ and a 1050 Ti, which is entry-level hardware nowadays.

He ran the game at native 1440p without FSR, and then at an internal 1600x900 (a higher internal resolution than FSR Balanced but lower than FSR Quality) on Proton-GE; these are the results.

(he locked the framerate at 60fps but the game runs at more fps than that)
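The arithmetic backs that up: 2560/1600 is a 1.6x factor, which sits between FSR Quality (1.5x) and Balanced (1.7x):

```shell
# 1600x900 to 2560x1440 is a 1.6x upscale on each axis,
# between FSR Quality (1.5x) and FSR Balanced (1.7x).
awk 'BEGIN { printf "%.1f\n", 2560 / 1600 }'    # prints 1.6
```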

FSR OFF. Native 1440p
mB4sm8h.png


FSR ON | 1600x900 -> 2560x1440
vCgW3pl.png
 
I'm currently playing one of the most impressive games I've ever seen in my ENTIRE life. This is the cusp of one of my favourite engines, CryEngine. The British Isles stages are a sight to behold. Gosh!

What Crytek created has seldom been replicated.

Ryse: Son of Rome. FSR Quality: 1706x960 -> 1440p


k4r474J.png


Ryse: Son of Rome. No FSR | Native 1440p

lKnLJdj.png
 
AMD really needs to integrate this into their driver software, like they've done with RIS. Sure, it could be better if implemented in-game, but it would be great to use it in a way that keeps FreeSync working and avoids the game-specific issues you get with third-party software.
 
Ryse: Son of Rome is probably one of the best games to showcase FSR.

The game truly shines with FSR on, to the point of surpassing native. And that's at FSR Quality! :yes:

I guess this is because Ryse: Son of Rome uses a temporal antialiasing implementation that is perfect for the tech.

Note the texture detail: the "N" on the emperor's palace door, or the statue's overall detail.

Also, the framerate might have something to do with it, because of the TAA. At FSR Quality the framerate is a rock-solid 60fps, reaching 80fps uncapped. Native's framerate is more like 1 to 5 fps.

This is running on a 1060 3GB, where the framerate TOTALLY plummets at native 1440p (my GTX 1080 has some issues with its fans, so I am using this one).

FSR ON. FSR Quality -> 1440p (internal 1706x960)
OpiQzuc.jpg


FSR OFF. Native 1440p
M6PHtX9.png


 
Ryse: Son of Rome also has an internal upscaler; this is how the game looks with the resolution set to 1792x1008 (70% scale) using the internal upscaler:

FSR OFF. Ryse's internal upscaler, 1792x1008 to 1440p
N8Fh1hT.png


FSR ON. FSR Quality -> 1440p (internal 1706x960)
OpiQzuc.jpg
 
Your framerates seem to be capped at 60
This is because I use two variables in the launch options (in the game's properties in Steam): one to play with FSR on and the other to lock the game at 60fps.

WINE_FULLSCREEN_FSR=1 DXVK_FRAME_RATE=60 %command%

If I don't lock the framerate, the game runs at between 120 and 180fps at an internal 1706x960, but I prefer to play in silence; at 60fps the GPU and CPU run the game comfortably and with ease.
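For anyone adapting that line: WINE_FULLSCREEN_FSR and DXVK_FRAME_RATE are the variables used in the post above; the sharpening variable in the last variant is an assumption about newer Proton-GE builds, so check your build's release notes before relying on it.

```shell
# Steam launch options (game Properties -> Launch Options), shown as comments:
#
# FSR on, framerate uncapped:
#   WINE_FULLSCREEN_FSR=1 %command%
#
# FSR on with DXVK's 60fps cap (the combination used above):
#   WINE_FULLSCREEN_FSR=1 DXVK_FRAME_RATE=60 %command%
#
# Assumed extra: some Proton-GE builds also read WINE_FULLSCREEN_FSR_STRENGTH
# to adjust sharpening (0-5) -- treat this as build-dependent:
#   WINE_FULLSCREEN_FSR=1 WINE_FULLSCREEN_FSR_STRENGTH=2 %command%
```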
 