AMD FidelityFX on Consoles

invictis

Newcomer
AMD intends to release FidelityFX on both RDNA1 and RDNA2, including both next-gen consoles.
Reading through AMD's info on it, one of the main points is VRS. AMD makes a point of saying it is DirectX 12 Ultimate VRS.
We know that the PS5 doesn't have VRS, and certainly doesn't use DX, so how will FidelityFX fit in, or differ, on the PS5?
Of note is that AMD has said FidelityFX is being used in Dirt 5, for example, and we know that Dirt 5 uses VRS on the XSX but not on the PS5.
 
Didn't they say that they played around with ML to scale textures for VRS if the higher-resolution texture isn't yet in memory?
 
So far, AMD has developed these things very openly (in contrast to Nvidia's additions). Therefore, if AMD releases its super-resolution tech, it should also run on Nvidia hardware. So I really can't wait for AMD to release it, so that the Nvidia-only, "non-future-proof" DLSS (because sooner or later older cards become incompatible with newer games, and newer cards with older games) vanishes in favour of an open solution.
 


AFAIK, FidelityFX isn't tied to the DX12 API.
Similarly, VRS, or any form of foveated rendering, isn't tied to DX12 either. The specific VRS method present in the DX12U suite is of course tied to DX12 (there's a quick sketch at the end of this post).

We know the PS5 doesn't support DX12 Ultimate's VRS, which of course it doesn't because Sony wouldn't use Microsoft's API.

That's not to say the PS5 doesn't support any form of hardware accelerated foveated rendering. It probably does, considering the number of patents Sony has on the subject, with applications going as far back as 2016.

It's true that the PS5 version of Dirt 5 isn't using any form of foveated rendering, but it also seems like it wasn't needed, considering its performance.
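
Out of curiosity, this is roughly what requesting DX12 Ultimate's VRS looks like through the API itself, which is exactly why that specific method can't exist on a console that doesn't run DirectX. A minimal sketch, assuming an existing device and command list, with error handling omitted:

Code:
#include <d3d12.h>

// Ask the device whether it supports DX12U VRS at all, and at which tier.
bool QueryVrsTier(ID3D12Device* device, D3D12_VARIABLE_SHADING_RATE_TIER* outTier)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &opts6, sizeof(opts6))))
        return false;
    *outTier = opts6.VariableShadingRateTier;
    return opts6.VariableShadingRateTier != D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED;
}

// Tier 1: one coarse rate applies to everything drawn until the next call.
void DrawAtCoarseRate(ID3D12GraphicsCommandList5* cmdList)
{
    // Shade once per 2x2 pixel block; null combiners = passthrough defaults.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    // ... issue draw calls here ...
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr); // back to full rate
}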
 
Dirt 5 also had performance drops on PS5. Do you think Codemasters wouldn't have added VRS to the PS5 version to get even better performance, when they added it to the XSX and PC versions?
If the PS5 had VRS, it would have been used.
 
Indeed, IIRC, some here were mentioning that the VRS options are not available for Nvidia cards either. So I'm not even that sure that it's necessarily coded for the DX12 PC branch. Frankly, I'm not sure how they got it onto Xbox without doing that.
 

Like I said, the PS5 does not support DX12 Ultimate's VRS, nor will it ever. That topic is moot.

I get that Microsoft's technical marketing department was successful at making a big deal out of the PS5 not supporting specific implementations of Microsoft's own API (well, duh), and Sony not caring to publicly share the technical details of their SoC happened to play right into their hands.
Claiming the PS5 doesn't support Microsoft's "patented VRS" implementation is as obvious and meaningless as claiming the Series X doesn't support GNM fragment shaders (well, duh).


As for the performance drops, they seem so rare that I doubt Codemasters would find it worth the development time and cost to implement whatever foveated rendering technique Sony supports in their hardware. Like all practical VRS implementations I've seen so far, it doesn't look like it worked any miracles on the DX12 consoles anyway.
 
I really don't know what some people have against VRS that they must write such misinformation.
You can also use VRS via DX11/OpenGL/Vulkan (see the sketch below). It is not DX12-exclusive; it's just a name for a technical feature that has nothing to do with DirectX.

It is also only one feature to reduce the workload a bit, especially good for VR with eye-tracking.
VRWorks - Variable Rate Shading (VRS) | NVIDIA Developer
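
And to back that point up: Vulkan standardised the same idea in the cross-vendor VK_KHR_fragment_shading_rate extension (Nvidia's VRWorks path originally went through a vendor extension). A minimal sketch, assuming the extension and its pipelineFragmentShadingRate feature were enabled at device creation:

Code:
#include <vulkan/vulkan.h>

// Per-draw fragment shading rate in Vulkan, no DirectX anywhere in sight.
void SetCoarseShadingRate(VkCommandBuffer cmd)
{
    VkExtent2D fragmentSize = {2, 2};  // shade once per 2x2 pixel block

    // Keep the pipeline/primitive and attachment combiners neutral.
    VkFragmentShadingRateCombinerOpKHR combiners[2] = {
        VK_FRAGMENT_SHADING_RATE_COMBINER_OP_KEEP_KHR,
        VK_FRAGMENT_SHADING_RATE_COMBINER_OP_KEEP_KHR,
    };

    // In real code this extension entry point is loaded via vkGetDeviceProcAddr.
    vkCmdSetFragmentShadingRateKHR(cmd, &fragmentSize, combiners);
}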
 
The only reason that people keep posting weirdly dismissive or incorrect comments about a rendering technique like VRS is because one of the consoles does not support it in a standardised manner codified by the rest of the graphics industry at large. If it were just another technique supported by both consoles, like, say... ray tracing... I honestly think people would be more excited about its implications and use cases.
VRS is neat IMO, I look forward to seeing what devs do with it.
----
@invictis - FidelityFX is branding for a suite of effects that AMD offers for developers to plug into their engines, should they fit. They are designed to run well on AMD hardware by default (with less or no consideration of other hardware). It doesn't have much to do with whether a game uses DirectX or not. If a developer wanted to use a FidelityFX effect on the PS5, like Contrast Adaptive Sharpening, they would just have to make sure the effect hooked up properly to the graphics API used there. That is about it, really (rough sketch below).
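
For illustration, the host-side hookup for CAS looks something like this. CasSetup/CasFilter are the names from AMD's public ffx_cas.h header, but the Renderer/Texture types and the Dispatch/Bind calls below are hypothetical stand-ins for whatever RHI an engine uses, so treat this as a shape sketch rather than drop-in code:

Code:
// Hypothetical engine-side hookup for FidelityFX CAS (Contrast Adaptive Sharpening).
#define A_CPU 1
#include "ffx_a.h"    // AMD's portability header; provides varAU4/AF1 on the CPU
#include "ffx_cas.h"  // provides CasSetup here and CasFilter in the shader

void RunCasPass(Renderer& rhi, Texture& color, Texture& output, float sharpness)
{
    // CAS consumes two packed constant vectors, recomputed on resolution changes.
    varAU4(const0);
    varAU4(const1);
    CasSetup(const0, const1, sharpness,
             (AF1)color.width,  (AF1)color.height,    // input size
             (AF1)output.width, (AF1)output.height);  // output size

    rhi.SetComputeShader("cas.hlsl");  // a shader whose inner loop calls CasFilter
    rhi.SetConstants(const0, const1);
    rhi.BindTexture(0, color);
    rhi.BindRWTexture(0, output);
    // The reference CAS shader covers a 16x16 region per thread group.
    rhi.DispatchCompute((output.width + 15) / 16, (output.height + 15) / 16, 1);
}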
Indeed, IIRC, some here were mentioning that the VRS options are not available for Nvidia cards either. So I'm not even that sure that it's necessarily coded for the DX12 PC branch. Frankly, I'm not sure how they got it onto Xbox without doing that.
I have access to the beta branch of that game, and I honestly cannot recall any VRS settings on AMD or NV cards.
 

The misinformation here is being done by people conflating --> DX12 Ultimate's VRS <-- with every other foveated rendering technique out there. All of these are examples of foveated rendering:

VRWorks VRS -> Nvidia's tools for implementing foveated rendering across several APIs, with a focus on Virtual Reality applications.
DirectX 12 Ultimate VRS -> Microsoft's implementation of foveated rendering with a focus on 2D panels, obviously limited to devices that use DirectX 12, i.e. PCs (Intel, Nvidia, AMD) and Xbox consoles.
XR Adreno Foveation -> Qualcomm's name for supporting foveated rendering, with a focus on VR, in their Adreno GPUs starting with the Snapdragon 845 back in 2018; it supports OpenGL and Vulkan. They seemingly extended the feature with a focus on 2D panels for the Adreno 660 in the Snapdragon 888 (though that could be just marketing, and it may be supported on all Adreno 6xx). I doubt Adreno Foveation/VRS is supported in DirectX 12, even if Qualcomm does write drivers for Windows.



On whether the PS5 supports some form of foveated rendering or not, I'll just leave these three links to patents registered by Sony (AFAIR there are a bunch more, but these are what I found) and let people draw their own conclusions:

- 2018 (originally filed in 2016): https://pdfaiw.uspto.gov/.aiw?Docid=20190384381
- 2019 (originally filed in 2016): https://pdfaiw.uspto.gov/.aiw?Docid=20190156794
- Filed in 2014 by Mark Cerny: https://patentscope.wipo.int/search/en/detail.jsf?docId=WO2015153167

Has Sony talked about the foveated rendering implementations they have in store for the PS5? No, they haven't, just like they haven't talked about almost anything low-level in their SoC to any tech/gaming outlet or during conferences, unlike Microsoft.
I think if they ever do talk about it, it will be if/when they talk about optimizations for PSVR2.


The only reason that people keep posting weirdly dismissive or incorrect comments about a rendering technique like VRS is because one of the consoles does not support it in a standardised manner codified by the rest of the graphics industry at large.
And the only reason other people keep posting incorrect or weirdly glorified clamours over a broadly studied and widely implemented concept like foveated rendering is because one console supports an API-specific implementation and the other does not.
This goes both ways.

As for the "dismissive" part, well yes I agree that DX12's VRS will be a bit dismissible as long as real-life results look like this:

[image: Hitman 3 VRS benchmark chart]



Perhaps DX12U's Tier 2 implementations will bring more relevant performance boosts, but so far it doesn't look like a game-changing feature.
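
For reference, Tier 2 is where per-screen-region control comes in: instead of one rate per draw, you hand D3D12 a small R8_UINT image in which each texel picks the rate for one tile of pixels. A minimal sketch, assuming the device already reported Tier 2 support:

Code:
#include <d3d12.h>

// Tier 2: drive the shading rate from a screen-space image. Each texel of
// 'rateImage' (DXGI_FORMAT_R8_UINT) holds a D3D12_SHADING_RATE value and
// covers one tile of ShadingRateImageTileSize pixels (queried via OPTIONS6).
// The image must be in D3D12_RESOURCE_STATE_SHADING_RATE_SOURCE when used.
void BindShadingRateImage(ID3D12GraphicsCommandList5* cmdList,
                          ID3D12Resource* rateImage)
{
    D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH, // per-draw vs per-primitive rate
        D3D12_SHADING_RATE_COMBINER_OVERRIDE,    // let the screen-space image win
    };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
    cmdList->RSSetShadingRateImage(rateImage);
}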
 

It's console wars, what do you expect? Anything mentioned about the PS5's SSD/IO and compression advantages gets easily dismissed, no different than the XBSX's compute and current variable refresh rate advantages. Instead of embracing each console's strengths and what they mean for the industry as a whole, it's the usual par-for-the-course hot takes and passive-aggressive attitudes in these threads.
 
I think right now, when we are still early in the next-gen phase and have a lot of spare power, it is easy to dismiss things like VRS. But with the consoles' fixed hardware, VRS and any other rendering tech that gives us a free performance boost will become more and more important over time.

"Tier 2 VRS allows for a free boost in performance with minimal visual impact. As we see more adoption of 120+ FPS and higher fidelity effects, it’s become increasingly important that we spend our GPU budget in all the right places, making Tier 2 VRS a welcome tool to help tackle the next generation of rendering."
 
It's too early to dismiss anything because, quite frankly, nothing has been put to use except for raw capabilities against their own software stack. You won't really know how any of these features will affect things until game development has had sufficient time to mature away from the last generation.

Besides, I'm not expecting a long-lasting gem of optimization genius to happen during the COVID era.
 
It's console wars, what do you expect? Anything mentioned about the PS5's SSD/IO and compression advantages gets easily dismissed, no different than the XBSX's compute and current variable refresh rate advantages.

If there's one thing I'd like the PS5 to get, it's VRR, as well as games that fully embrace it.
I think Microsoft did the right thing in supporting it at release, as there will probably be a significant time period between the feature being present in the console and devs working on a proper implementation.


VRS can be used to implement foveated rendering, but there are other use cases, like applying content-adaptive shading to rasterization in general, so VRS is helpful beyond cases involving just stereoscopic/foveated rendering ...
Yes, and that flexibility is of course a good thing.

But AFAIK all the current implementations are still based on the same reverse-MSAA method that has been studied for a while, and it consists of providing a tile buffer and coarse quads before the shading step.

[image: diagram of the tile buffer and coarse quads feeding the shading step]
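
In software terms, the diagram boils down to something like this (a toy, completely hypothetical CPU-side loop, not real GPU code): a per-tile rate buffer decides how many screen pixels share one shading result, and that single result is broadcast to every fine pixel its coarse quad covers.

Code:
#include <algorithm>
#include <cstdint>
#include <vector>

struct Rate { int w, h; };  // 1x1, 2x1, 1x2, 2x2, ...

// Stand-in for the real pixel shader.
static uint32_t ShadePixel(int x, int y) { return uint32_t(x * 31 + y); }

void ShadeCoarse(int width, int height, int tileSize,
                 const std::vector<Rate>& tileRates,  // one entry per tile
                 std::vector<uint32_t>& framebuffer)  // width*height pixels
{
    const int tilesX = (width + tileSize - 1) / tileSize;
    const int tilesY = (height + tileSize - 1) / tileSize;
    for (int ty = 0; ty < tilesY; ++ty)
    for (int tx = 0; tx < tilesX; ++tx) {
        const Rate r = tileRates[ty * tilesX + tx];
        const int yEnd = std::min((ty + 1) * tileSize, height);
        const int xEnd = std::min((tx + 1) * tileSize, width);
        // Walk the tile in coarse steps: one "shader invocation" per coarse quad.
        for (int y = ty * tileSize; y < yEnd; y += r.h)
        for (int x = tx * tileSize; x < xEnd; x += r.w) {
            const uint32_t color = ShadePixel(x, y);
            // Broadcast the single result across the whole coarse quad.
            for (int dy = 0; dy < r.h && y + dy < height; ++dy)
            for (int dx = 0; dx < r.w && x + dx < width; ++dx)
                framebuffer[(y + dy) * width + (x + dx)] = color;
        }
    }
}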


We do know RDNA2, Intel Gen11/Gen12 and Turing/Ampere have this in hardware, but it seems Activision also made it work quite successfully using compute shaders only, and they even claim better performance than the hardware implementations (I can't tell if they compared it to Tier 1 or Tier 2 VRS, though).
It wouldn't be the first time a hardware solution gets sidelined because a compute-only method was found to be more performant. I think that's what happened with AMD's earlier form of Primitive Shaders in Vega: AMD cancelled driver support because devs had started using compute-based culling methods more efficiently (erm... isn't there, like, an NDA that could have expired on this, @Rys? Asking for a friend).
 
As for the "dismissive" part, well yes I agree that DX12's VRS will be a bit dismissible as long as real-life results look like this:
That's a 2-5% gain in those charts. That's the difference between 57 fps and 60. People can be dismissive about its usefulness, but I'm sure that if a game ran at 57 fps without it and an update added it and hit 60, they would be pretty happy about the change.

Wolfenstein 2 got a 5-15% gain, IIRC. That's enough not to be dismissed, right?
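
Quick back-of-the-envelope on my part to put that in frame-time terms (my arithmetic, not from the article):

$$57 \times 1.05 \approx 59.9\ \text{fps}, \qquad \tfrac{1000}{57} \approx 17.5\ \text{ms} \;\rightarrow\; \tfrac{1000}{60} \approx 16.7\ \text{ms}$$

So a ~5% gain buys back a bit under a millisecond per frame at those framerates, which is exactly that 57-to-60 jump.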
 

It's a 5% change only when using "performance mode", which from what I've seen becomes too noticeable, at which point I wonder if we're not better off just reducing the render resolution by ~10% altogether and pairing it with RIS or equivalent (or, of course, DLSS 2 if available and you're on an Nvidia GPU). Per that article's own words:
We do not recommend using Hitman 3's VRS performance mode, as there are settings that can be lowered to deliver better-looking results and a larger overall performance impact.


Quality mode provides a 0.68% performance advantage at 4K, 1.8% at 1440p and 0.25% at 1080p, which makes me question whether the implementation was ever worth the effort. Especially considering these are average non-vsynced framerates, not locked.


Wolfenstein 2, which you mentioned, gets similarly small performance gains in quality mode:

[image: Wolfenstein 2 VRS quality-mode benchmark chart]
 