AMD's FSR 3 upscaling and frame interpolation *spawn

My understanding is that FSR3 and DLSS require the same underlying information from the engine in order to generate their upscaled master frames and generated / interpolated frames. It seems strange that the FSR3 components are somehow distinct from the DLSS components, and stranger still to suggest those FSR3 components were still active when the configuration option said they were disabled.
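As a rough, hypothetical illustration of that shared information (the names below are mine, not any vendor's actual API), both techniques are fed essentially the same per-frame data by the engine:

```cpp
// Hypothetical illustration only -- not the actual FSR3 or DLSS API.
// Both techniques consume essentially the same per-frame engine data.
struct UpscalerFrameInputs {
    void*  colorBuffer;        // rendered frame at internal (lower) resolution
    void*  depthBuffer;        // scene depth for reprojection / disocclusion checks
    void*  motionVectors;      // per-pixel motion, usually in pixel or UV units
    float  jitterOffsetX;      // sub-pixel camera jitter applied this frame
    float  jitterOffsetY;
    float  frameDeltaTimeMs;   // needed for frame interpolation pacing
    float  cameraNear, cameraFar, cameraFovY;
    bool   cameraCut;          // reset temporal history on scene changes
};

// A DLSS path and an FSR3 path would both be fed from the same inputs,
// which is why sharing the integration plumbing between them is plausible.
void dispatchUpscaler(const UpscalerFrameInputs& inputs);
```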

It's also super easy for me to sit here in my office chair and speculate about someone else's engine and how they got it wrong ;) Hopefully they'll figure it out soon and get it re-implemented, perhaps this time also leaving those features disabled when they're not requested / needed.

Edit: If DLSS is still enabled, then there are multiple wrappers out there which will translate DLSS calls into FSR3 calls, at least for framegen anyway. I used the DLFG-to-AMDFG "hack" in both Starfield and Cyberpunk to great effect on my 3080Ti and also my 1070MQ laptop. I was very happy with the results. Also for just simple upscaling, I preferred Cyberpunk's XeSS 1.2 to FSR2 for image quality results, especially while in motion.
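Conceptually, those DLSS-to-FSR3 wrappers work by standing in for the DLSS runtime and forwarding the same engine data to FSR3's frame generation instead. A very rough sketch, with entirely hypothetical function names on both sides:

```cpp
// Hypothetical sketch of a DLSS-frame-gen -> FSR3-frame-gen shim.
// Real wrappers replace the DLSS runtime DLL and translate its entry points;
// none of these names are the actual NVIDIA or AMD APIs.
struct FrameGenInputs { /* color, depth, motion vectors, HUD-less frame, ... */ };

// Stub standing in for the AMD-side frame generation dispatch.
int Fsr3FrameGen_Dispatch(const FrameGenInputs*) { return 0; }

// What the game thinks it is calling (stand-in for the DLSS-G entry point).
extern "C" int DlssFrameGen_Evaluate(const FrameGenInputs* in) {
    // The shim repackages the same engine data and hands it to FSR3 instead.
    return Fsr3FrameGen_Dispatch(in);
}
```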
 
I mean, they've got that double-rate matrix multiplication pipe in RDNA4, and have had that big wave matrix multiplication (WMMA) support in RDNA3 for a while, so making use of those makes sense.

AI is better than humans at figuring out things like cheap upscaling, given enough guidance. For those wondering how, I suppose they could've just reverse engineered what the AI is doing, but then again what the AI is doing is presumably built on low bit depth matrix multiplication anyway, since that's what it has access to and what it will be rewarded for optimizing.
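To make "low bit depth matrix multiplication" concrete, here's a minimal scalar sketch of the INT8 multiply-accumulate pattern that dot-product and WMMA hardware accelerates in bulk; a real upscaling network would run this per tile on the matrix units, not in scalar code:

```cpp
#include <cstdint>
#include <cstddef>

// Minimal scalar sketch: C = A * B with 8-bit inputs and 32-bit accumulation.
// This is the arithmetic pattern that dot4 / WMMA hardware accelerates; a real
// quantized upscaling network would run layers like this on the matrix units.
void gemm_int8(const int8_t* A, const int8_t* B, int32_t* C,
               size_t M, size_t N, size_t K) {
    for (size_t m = 0; m < M; ++m) {
        for (size_t n = 0; n < N; ++n) {
            int32_t acc = 0;  // wide accumulator avoids overflowing 8-bit products
            for (size_t k = 0; k < K; ++k)
                acc += int32_t(A[m * K + k]) * int32_t(B[k * N + n]);
            C[m * N + n] = acc;
        }
    }
}
```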

Either way, looking forward to more competition in upscalers. XeSS is mostly good but has some ghosting problems on occasion.
 
The only major new formats supported for GFX12 WMMA are FP8/BF8, but they also introduced the very same format support for the dot4 instructions, so it makes one really wonder whether they truly implemented a faster dedicated hardware path for matrix computations.
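One back-of-envelope way to frame that question is multiply-accumulates per issued instruction. These are my own rough numbers and assumptions (wave32, ignoring issue rates, dual-issue and data-path details), not measurements:

```cpp
#include <cstdio>

// Back-of-envelope comparison, not measured data: MACs per issued instruction
// for a packed dot4 (4 MACs per lane, wave32 assumed) versus a single
// 16x16x16 WMMA tile op. Real throughput depends on issue rate and data paths.
int main() {
    const int lanesPerWave = 32;                 // assumption: wave32
    const int dot4Macs     = 4 * lanesPerWave;   // 4 MACs/lane * 32 lanes = 128
    const int wmmaMacs     = 16 * 16 * 16;       // one 16x16x16 tile = 4096

    std::printf("dot4: %d MACs/inst, WMMA: %d MACs/inst (%dx)\n",
                dot4Macs, wmmaMacs, wmmaMacs / dot4Macs);
    return 0;
}
```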
 
It's worth reminding folks: upscaling (both in the time domain as frame generation and in the spatial domain as FSR or DLSS) isn't only about achieving super high framerates; it's also about saving power - sometimes quite a bit of power.

The Asus OG OC 4090 at my maximum tested stable overclock of 2865MHz at 1014mV cannot sustain the 100Hz refresh rate of my 3440x1440 monitor when playing Cyberpunk at all-uber settings with path tracing enabled. Unfortunately, while it tries earnestly to accomplish said task, it will chew through more than 400W according to MSI Afterburner, and the whole PC will consume more than 550W at the wall as reported by my power monitoring gear.

Now, if I cap the framerate at 50FPS and enable frame generation and DLSS Quality, it will absolutely sit at the 100Hz maximum refresh of the monitor, while the card relaxes at a much more palatable and far less noisy 2640MHz at 925mV. Power draw of the card plummets to around 250W, and total system power stays under 375W with far lower case and radiator fan speeds. That's 175W less heat and related noise polluting my office space while providing a far more stable framerate and a more consistent gaming experience.
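Putting those numbers together (the wattages are the ones quoted above; the session length is just an arbitrary example):

```cpp
#include <cstdio>

// Quick arithmetic on the power figures quoted above. The 3-hour session
// length is an arbitrary example, not a measurement.
int main() {
    const double wallBruteForceW = 550.0;  // uncapped, no frame gen or DLSS
    const double wallFrameGenW   = 375.0;  // 50 fps cap + frame gen + DLSS Quality
    const double savedW          = wallBruteForceW - wallFrameGenW;  // 175 W
    const double sessionHours    = 3.0;    // example session length
    const double savedKWh        = savedW * sessionHours / 1000.0;   // 0.525 kWh

    std::printf("Saved %.0f W at the wall, %.3f kWh over %.0f h\n",
                savedW, savedKWh, sessionHours);
    return 0;
}
```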

I feel like some folks get so hung up on maximal performance they forget the other very relevant uses for this technology.
 
This point of view can be an important consideration for those who play on a PC that already consumes a lot of watts. However, for those who play on an energy-efficient console, the potential performance increase on fixed hardware is the more important aspect.
 
Why can't it be both? Imagine how much more energy efficient a console will be when frame generation is fully available across all platforms. And then of course, developers may trade the recovered energy for additional detail, assets, physics, or who knows what.

It's literally two sides of the same coin. Use all your power for brute force, or let upscaling do a lot of the lifting, thus enabling even more interesting work to be done. The power budget is the power budget, so you might as well burn it all, right?
 
I appreciate your attitude.
 
It is almost certain that they will keep developing this until it can take a 30 FPS base up to 60 FPS.
I would hope so. That's a far more useful application of frame generation than 60fps->120fps, in my opinion. It could open up a huge amount of extra headroom for developers to play with, which will be important for next-gen consoles in particular.
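For context on why the base framerate matters, here's a simplified frame-time calculation; the model assumes roughly one real frame is held back for interpolation and ignores the cost of generating the frame itself:

```cpp
#include <cstdio>

// Simplified frame-time arithmetic for frame generation. Model: a real frame
// is held back so an interpolated frame can be shown between it and the
// previous one; the interpolation cost itself is ignored.
int main() {
    const double bases[] = {30.0, 60.0};            // base framerates to compare
    for (double baseFps : bases) {
        double baseMs      = 1000.0 / baseFps;      // real frame time
        double presentedMs = baseMs / 2.0;          // frames shown twice as often
        double addedLagMs  = baseMs;                // ~one base frame of extra delay
        std::printf("%2.0f->%3.0f fps: presents every %.1f ms, ~%.1f ms added latency\n",
                    baseFps, baseFps * 2.0, presentedMs, addedLagMs);
    }
    return 0;
}
```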
 
I didn't watch the whole video, but I'm fairly sure I didn't miss anything about methodology at the start (in case I did, sorry in advance). Anyway, did he try to match the sharpening levels? I think at some point in the past their tests were comparing FSR with sharpening against DLSS with none, which might explain the texture differences in CP2077.
 
I need to go back and look at CP2077 now that FSR3 is implemented. I know I personally preferred the look of XeSS over FSR2 when playing CP2077 on my older 1070-equipped laptop. Maybe the sharpening is the reason?
 
I'm sure (but would need to check to be 100%) that FSR defaults with sharpening on but DLSS doesn't.
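For what it's worth, FSR2's sharpening (RCAS) is a per-dispatch toggle and strength that the game passes in, so "on by default" is really the game's choice rather than something baked into FSR itself. A rough sketch of that hand-off; field and function names are approximations from memory, not copied from the SDK headers:

```cpp
// Rough sketch of how a game feeds sharpening to FSR2 each frame.
// Names approximate the FidelityFX FSR2 integration -- treat as illustrative.
struct Fsr2DispatchParams {
    bool  enableSharpening;   // RCAS sharpening pass on/off
    float sharpness;          // 0.0 .. 1.0, e.g. CP2077's slider default of 0.50
    // ... color, depth, motion vectors, jitter, etc.
};

void dispatchFsr2(const Fsr2DispatchParams&) { /* hand off to FSR2; stubbed here */ }

void renderFrame() {
    Fsr2DispatchParams p{};
    p.enableSharpening = true;   // the game's choice, not an FSR-mandated default
    p.sharpness        = 0.5f;
    dispatchFsr2(p);
}
```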
 
I hopped into CP2077 real fast; I only have options for FSR2.1 -- is FSR3 implemented and I somehow don't have the update and/or it isn't showing up for me?

Anyway, to @davis.anthony 's point, DLSS defaults to zero sharpening while FSR2.1 defaults to 50% (the slider says 0.50 on a scale of 0 to 1). So maybe it's a default slider setting issue?

Edit: ah, I hadn't watched the video yet; I see it is pointed out FSR2.1 is what CP2077 uses. Gotcha.
 