AMD FSR antialiasing discussion

  • Thread starter Deleted member 90741
Besides enemies, it was the only moving object in my level. The noisy ghosting with FSR 2.0 in 4K is noticeable. So I think it will be worse in games like Cyberpunk with cars, people, trains etc. moving around.
 
I think the question for me is: is FSR 2.0 the best we're going to get from solutions that don't use ML?

DLSS still has a noticeable advantage in details like hair and transparent textures like chain-link fences.
 
They're testing older GPUs with it, and the scaling is all over the place.
On the RTX 2080, DLSS Quality still gives 10% more fps compared to FSR 2 Quality, and with higher quality of upscaling. DLSS Performance gives you 15% more fps compared to FSR 2 Performance at a much much higher upscaling quality.

For those still in doubt, this is the difference that ML makes compared to hand crafted algorithms.
 
Seems that the weaker GPUs that need the performance most actually get the smaller increases. With the 6700 XT/6800, things get more interesting.
 
I wish they had performance numbers at the native scaling resolution for comparison.

For example having 1440p Native to compare against 4k Quality.

I'll see if I have time later to manually parse their numbers and try to assemble a chart with comparisons.
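Something like the chart I have in mind could be sketched quickly. A minimal example, assuming completely made-up fps numbers (the real values would have to come from the review's charts): the interesting figure is the upscaling overhead, i.e. how much slower the upscaled output is than simply running natively at the internal render resolution.

```python
# Sketch of the comparison: upscaled 4K output vs native fps at the
# internal render resolution. All fps values below are hypothetical.
results = {
    # mode: (render_res, fps_native_at_render_res, fps_upscaled_to_4k)
    "FSR 2 Quality (1440p -> 4K)": (1440, 95.0, 82.0),
    "DLSS Quality (1440p -> 4K)":  (1440, 95.0, 90.0),
}

for mode, (res, native_fps, upscaled_fps) in results.items():
    # Overhead: fps lost to the upscaling pass, relative to native render res
    overhead = (native_fps - upscaled_fps) / native_fps * 100
    print(f"{mode}: ~{overhead:.1f}% overhead vs {res}p native")
```

That overhead column is exactly what the 1440p-native-vs-4K-Quality comparison would expose.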

More importantly, I'd wish to see more than just one game, which was surely well optimized for the first public showing. How does it perform in Horizon, CP2077, God of War...?
 
For image quality comparisons with these solutions I'd prefer to see them compared to a more "reference" image than "native." I'd like to see 1440p FSR 2.0 Quality, 1440p DLSS Quality compared to 1440p TAA super-sampled. Not sure how many games that support FSR 2.0, DLSS have an option to increase render-scale to 200%. Not sure the easiest way to get super-sampled output other than that. DSR on Nvidia seems kind of inconsistent. Super-sampled comparisons would be the best way to check things like colour accuracy.
 
For image quality comparisons with these solutions I'd prefer to see them compared to a more "reference" image than "native." I'd like to see 1440p FSR 2.0 Quality, 1440p DLSS Quality compared to 1440p TAA super-sampled. Not sure how many games that support FSR 2.0, DLSS have an option to increase render-scale to 200%. Not sure the easiest way to get super-sampled output other than that. DSR on Nvidia seems kind of inconsistent. Super-sampled comparisons would be the best way to check things like colour accuracy.

You can use NvInspector to force SSAA in games.
 
On the RTX 2080, DLSS Quality still gives 10% more fps compared to FSR 2 Quality, and with higher quality of upscaling. DLSS Performance gives you 15% more fps compared to FSR 2 Performance at a much much higher upscaling quality.

For those still in doubt, this is the difference that ML makes compared to hand crafted algorithms.

Or is it because DLSS uses dedicated cores while FSR 2 is stealing processing power from the main GPU cores? I'm probably wrong though, as I basically understand nothing about this.


Edit: btw, what if Nvidia modified FSR 2 to run on tensor cores? Is that even feasible, given how specialized tensor cores are?
 
You can use NvInspector to force SSAA in games.

Cool. I think FSR 2.0 vs DLSS vs native vs super-sampled is probably the best. Super-sampled will show you what your target is. Even if SSAA is running at 5 fps, just getting some frames to compare would be the "ground truth" or reference image for quality.
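Once you have that SSAA reference image, the comparison doesn't even have to be eyeballed; you can score each upscaler's output against it with a standard metric like PSNR. A minimal sketch, with toy made-up pixel values standing in for real captures:

```python
import math

def psnr(reference, test, max_val=255.0):
    """Peak signal-to-noise ratio of a test image against a reference.
    Images are flat lists of pixel values; higher means closer to reference."""
    mse = sum((r - t) ** 2 for r, t in zip(reference, test)) / len(reference)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * math.log10(max_val ** 2 / mse)

# Toy example: score two hypothetical upscaled outputs against the
# supersampled "ground truth" capture (all pixel values made up).
ssaa = [10, 50, 200, 120]
fsr  = [12, 48, 190, 125]
dlss = [10, 51, 198, 121]
print(f"FSR  vs SSAA: {psnr(ssaa, fsr):.1f} dB")
print(f"DLSS vs SSAA: {psnr(ssaa, dlss):.1f} dB")
```

PSNR won't capture temporal artifacts like ghosting, but for still-frame colour accuracy against a supersampled reference it's a reasonable first check.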
 
On the RTX 2080, DLSS Quality still gives 10% more fps compared to FSR 2 Quality, and with higher quality of upscaling. DLSS Performance gives you 15% more fps compared to FSR 2 Performance at a much much higher upscaling quality.

For those still in doubt, this is the difference that ML makes compared to hand crafted algorithms.

I love how you just decide to ignore all of the other data including the frame-rates on Ampere GPUs and the 2060 data in that review where the difference with FSR in regards to performance is <5% to just pull out this one result with the 2080 and claim victory for AI as a whole.

It's a great laugh, not gonna lie.
 
On the RTX 2080, DLSS Quality still gives 10% more fps compared to FSR 2 Quality, and with higher quality of upscaling. DLSS Performance gives you 15% more fps compared to FSR 2 Performance at a much much higher upscaling quality.

For those still in doubt, this is the difference that ML makes compared to hand crafted algorithms.

You also get this extra bonus with DLSS too.

[attached comparison screenshot]
 
Cool. I think FSR 2.0 vs DLSS vs native vs super-sampled is probably the best. Super-sampled will show you what your target is. Even if SSAA is running at 5 fps, just getting some frames to compare would be the "ground truth" or reference image for quality.

Super-sampled isn’t the ideal target for motion artifacts though. In my experience 4xDSR doesn’t fully squash shader and temporal aliasing. Maybe 16xSSAA would do the trick.
 
I love how you just decide to ignore all of the other data including the frame-rates on Ampere GPUs and the 2060 data in that review where the difference with FSR in regards to performance is <5% to just pull out this one result with the 2080 and claim victory for AI as a whole.
RTX 2060 @1440p:
DLSS Q is 2% faster than FSR Q
DLSS P is 5% faster than FSR P

RTX 2080 @4K:
DLSS Q is 10% faster than FSR Q
DLSS P is 12% faster than FSR P

RTX 3080 @4K:
DLSS Q is 6% faster than FSR Q
DLSS P is 3% faster than FSR P
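For anyone wondering how those percentage deltas fall out of the raw fps numbers, it's just the relative difference. A quick sketch with hypothetical fps values (not the review's actual data):

```python
# How a "DLSS is N% faster than FSR" figure is derived from raw fps.
def pct_faster(fps_a, fps_b):
    """Percent by which fps_a exceeds fps_b."""
    return (fps_a - fps_b) / fps_b * 100

# Hypothetical 4K Quality-mode results for illustration only
dlss_q, fsr_q = 88.0, 80.0
print(f"DLSS Q is {pct_faster(dlss_q, fsr_q):.0f}% faster than FSR Q")
# prints: DLSS Q is 10% faster than FSR Q
```

Note the baseline matters: 88 fps is 10% faster than 80 fps, but 80 fps is only ~9% slower than 88, which is why different reviews quoting the same data can show slightly different percentages.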

FSR 2 performance looks like this:

[attached performance chart]
 
It works well (according to AMD) with dynamic resolution scaling.
Could other dynamic image alterations have a negative impact on the final upscaled image from FSR?
Variable Rate Shading, for instance. I know VRS should only be applied to insignificant portions of the frame, but if you're getting frame-to-frame differences in where VRS is applied, could that cause a problem for FSR? Perhaps flickering on a fairly static surface.
 