VRS: Variable Rate Shading *spawn*

Does the Forza Horizon 4 update on Series X and S use VRS? Can't remember specifically what DF said in their video for it.
Don't remember to be honest, only remember it was a little disappointing as the XSX has some setting lower than the One X (don't remember which one)
 
I don't believe it does. I believe the Forza Horizon 4 Series X/S upgrade was handled by another studio. Implementing VRS would require extensive work and was likely outside the scope of the project.

We can rest easy knowing that Playground Games and Turn 10 are building the next generation of Forza games to take full advantage of the new hardware and its features. I still can't wait to see what the next Horizon game is going to look like.
 
I think this is what Matt Hargett was explaining: with their custom GE, they eventually won't need VRS as described by MS to improve performance, as they'll gain more performance from their GE. And finally, they probably thought it was more important to shade all visible polygons at full res, and instead shade the most visible polygons at higher res (I'd agree with that).

My understanding is Sony's approach is to optimise early in the pipeline, Microsoft's approach is to optimise later in the pipeline. Common sense tells you that the earlier you discard unnecessary work the better, but until somebody provides an explanation of what the Geometry Engine does (and does not), you can't meaningfully compare the two approaches.
 
There is not really a different approach here. Mesh shaders are going to offer "early pipeline" discard. VRS will then allow for another step thereafter to help reduce shading costs on top of that discard.

PS5 dev tools at the moment just have no microcode assignment, or API, for any sort of universal variable rate shading - presumably because it is not in hardware at all. At least not in hardware like a DX Tier 2 variant might be.

It is not a difference of priorities or approaches - just one box is more feature rich than the other in this area.
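
To make "exposed in the API" a bit more concrete, here's a rough sketch of the caps query on the PC side using the public D3D12 API (nothing console-specific, adapter handling simplified) - it just reports which VRS tier and mesh shader tier the driver advertises:

```cpp
// Minimal sketch: query D3D12 for VRS and mesh shader support.
// Public PC API only - nothing here is console SDK code. Link with d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

int main() {
    ID3D12Device* device = nullptr;
    // nullptr = default adapter; real code would enumerate adapters via DXGI.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12 device available\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                              &opts6, sizeof(opts6)))) {
        switch (opts6.VariableShadingRateTier) {
            case D3D12_VARIABLE_SHADING_RATE_TIER_2:
                // Tier 2: per-draw, per-primitive and screen-space image rates.
                std::printf("VRS Tier 2, tile size %u\n",
                            opts6.ShadingRateImageTileSize);
                break;
            case D3D12_VARIABLE_SHADING_RATE_TIER_1:
                std::printf("VRS Tier 1 (per-draw only)\n");
                break;
            default:
                std::printf("No hardware VRS exposed\n");
        }
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &opts7, sizeof(opts7)))) {
        std::printf("Mesh shaders: %s\n",
                    opts7.MeshShaderTier != D3D12_MESH_SHADER_TIER_NOT_SUPPORTED
                        ? "supported" : "not exposed");
    }

    device->Release();
    return 0;
}
```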
 
PS5 Has VRS, Confirms Activision Lead Artist
https://segmentnext.com/2020/04/09/ps5-has-vrs/



Things aren't clear cut but maybe they have another solution?
 
I have heard that PS5 has no hardware VRS (no API, no microcode reference) as described above directly from two different graphics programmers working on PS5 games - two completely different dev studios. So I honestly am not sure why others say it does. I trust the people I am talking to, considering they are extremely competent.

That does not make the machine suddenly terrible or something. Maybe they can add in that to the SDK later on if it actually does support it in hardware - make an API portion of it just like MS has done. If not, well then it is just a difference between the machine's GPU capabilities.
 
Matt never confirmed VRS, and can we stop using this quote with unintended interpretations by users? Matt said ultimately you'd use VRS and the GE together. He went on to say that he doesn't confirm or deny PS5 having VRS.
 
I don't think the lack of VRS can make a big difference, but maybe the lack (according to Locuza) of mixed-precision dot-product instructions for ML could have a bigger potential impact (with a hypothetical AMD DLSS equivalent)
 

PS5 does have VRS... Software VRS implemented by, you guessed it, Activision:

https://research.activision.com/pub...able-rate-shading-in-call-of-duty--modern-war

But then so do the Pro and the original PS4.
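
To be clear about what "software VRS" means in spirit (this is my own toy illustration, not Activision's actual algorithm): you analyse how much detail each screen tile contains and pick a coarser shading rate where it won't be noticed. Something like:

```cpp
// Toy CPU illustration of the idea behind "software VRS" - not Activision's
// actual technique, just the general concept: measure how much detail each
// screen tile contains and pick a coarser shading rate where it won't be missed.
#include <cstdio>
#include <cmath>
#include <vector>

constexpr int W = 64, H = 64, TILE = 8;   // tiny synthetic frame, 8x8 tiles

int main() {
    // Synthetic luminance buffer: a smooth gradient plus a high-frequency patch.
    std::vector<float> luma(W * H);
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            luma[y * W + x] = (x < W / 2) ? x / float(W)               // smooth
                                          : ((x + y) % 2 ? 1.f : 0.f); // detailed

    // For each tile, measure local contrast and choose a rate:
    // 1x1 (full rate) for busy tiles, 2x2 for flat ones.
    for (int ty = 0; ty < H / TILE; ++ty) {
        for (int tx = 0; tx < W / TILE; ++tx) {
            float contrast = 0.f;
            for (int y = ty * TILE; y < (ty + 1) * TILE; ++y)
                for (int x = tx * TILE + 1; x < (tx + 1) * TILE; ++x)
                    contrast += std::fabs(luma[y * W + x] - luma[y * W + x - 1]);
            std::printf("%s ", (contrast > 4.f) ? "1x1" : "2x2");
        }
        std::printf("\n");
    }
    // A real implementation would do this analysis on the GPU and feed a map
    // like this into the shading pass (or into a hardware shading-rate image
    // where one exists).
    return 0;
}
```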
 
I have heard that PS5 has no hardware VRS (no API, no microcode reference)

I'm completely unfamiliar with the term "no microcode reference."

No API, I could still imagine the hardware is there but it's yet to be exposed. Might the absence of a microcode reference be a relatively strong indication that the hardware simply isn't there to expose?

I have to say, I don't particularly enjoy this "Geometry Engine vs VRS" slagging match that's begun. I'm not directing that at anyone in particular, I just think the general tone of the conversation is the kind of heated exchange that doesn't befit the mien of curious nerds.

I do, however, think it's interesting to know whether hardware VRS is included within the PS5, because that can then open up avenues of discussion and exploration that I think are actually worthwhile e.g. did Sony not think it worth it? Is it a well developed technique to Microsoft but much less so/not at all to Sony? Does the technique perhaps work well in conjunction with ML techniques like DLSS, and are such ML techniques well suited to the strengths of the XSX GPU? If the technique works well with AMD's DLSS equivalent, and given that there are AMD patents for chiplet ML accelerators, is Sony skipping hardware VRS in the PS5 and focussing on it with a hypothetical PS5 Pro?
 
Very interesting...
Does the PS5 have primitive shaders (as per AMD patent https://patentimages.storage.googleapis.com/66/54/00/c86d30f0c1c61e/US20200193703A1.pdf)?

Or instead mesh shaders (as described in the MS patent for index buffer compression: https://patentimages.storage.googleapis.com/54/e8/07/67037358a9952f/US20180232912A1.pdf)?
This MS patent in fact provides the most concise description (that I have found thus far) of what the mesh shader stage of the 3D pipeline actually is:

"In some implementations, referring to FIGS. 2B and 2C, each index buffer block 107 may be read by a mesh shader stage 91. For example, mesh shader stage 91 may be a combination of any one or more of vertex shader stage 82, domain shader stage 88, and/or geometry shader stage 90. As such, the implementation according to FIG. 2B may have an understanding of compressed indices in the input assembler 80. Accordingly, logical pipeline 14b may not read and write indices. Further, in some implementations, when tessellation is enabled (e.g., in FIG. 2C), block index decompression may be performed in the input assembler and/or vertex shader stages. Additionally, when tessellation is enabled, the vertex shader stage may get merged with the hull shader stage, rather than with the geometry shader stage. When tessellation is disabled (e.g., in FIG. 2B), then the block index decompression may occur in the mesh shader stage.

During a vertex phase, mesh shader stage 91 may read the vertex position of a compressed index based on the original index reconstructed by the IA. Further, mesh shader stage 91 may transform the vertex position according to a transform function. Mesh shader stage 91 may store position in groupshared memory 109 along with the original index.

Mesh shader stage 91 may, during the primitive phase, read the connectivity information prepared by the input assembler 80. Mesh shader stage 91 may further read the transformed vertices out of groupshared memory 109. Additionally, mesh shader stage 91 may perform culling and if a primitive survives, then mesh shader stage 91 may indicate it as visible for the subsequent hardware. Further, each surviving vertex may be marked or indicated as such in groupshared memory 109.

During the attribute phase, mesh shader stage 91 may, for all surviving vertices output the vertex position, read the attributes from the vertex buffer 102, transform the attributes according to a transform function, and output the surviving transformed attributes. Subsequently, logical pipeline 14b may proceed to the rasterizer stage 94."
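
To make the three phases a bit easier to follow, here's a rough CPU-side analogue of what the patent describes (my own sketch for illustration, not pipeline or patent code): transform the positions, cull primitives, then only output attributes for vertices that survive.

```cpp
// CPU-side analogue of the three phases the patent describes for the mesh
// shader stage: vertex phase (transform positions), primitive phase (cull),
// attribute phase (fetch attributes only for surviving vertices).
// Illustrative sketch only - not actual pipeline or patent code.
#include <cstdio>
#include <vector>
#include <array>

struct Vec2 { float x, y; };

int main() {
    // "Vertex buffer": positions plus a per-vertex attribute (e.g. a colour id).
    std::vector<Vec2> positions = {{0,0}, {1,0}, {0,1}, {-1,0}, {0,-1}};
    std::vector<int>  attribute = {10, 11, 12, 13, 14};
    // "Index buffer": two triangles, one wound counter-clockwise, one clockwise.
    std::vector<std::array<int,3>> tris = {{0,1,2}, {0,4,3}};

    // --- Vertex phase: transform positions into "groupshared memory". ---
    std::vector<Vec2> transformed(positions.size());
    for (size_t i = 0; i < positions.size(); ++i)
        transformed[i] = { positions[i].x * 2.f, positions[i].y * 2.f };

    // --- Primitive phase: cull back-facing triangles, mark surviving vertices. ---
    std::vector<bool> vertexSurvives(positions.size(), false);
    std::vector<std::array<int,3>> visible;
    for (const auto& t : tris) {
        Vec2 a = transformed[t[0]], b = transformed[t[1]], c = transformed[t[2]];
        float signedArea = (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
        if (signedArea > 0.f) {                 // keep counter-clockwise triangles
            visible.push_back(t);
            for (int idx : t) vertexSurvives[idx] = true;
        }
    }

    // --- Attribute phase: output attributes only for surviving vertices. ---
    for (size_t i = 0; i < positions.size(); ++i)
        if (vertexSurvives[i])
            std::printf("vertex %zu survives, attribute %d\n", i, attribute[i]);
    std::printf("%zu of %zu triangles visible\n", visible.size(), tris.size());
    return 0;
}
```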
 

Sorry my dude, but these are just the same misunderstandings IMO that have been recycled for yonks. No harm in another round though, I suppose! :)

1) They're explaining that processes on the GE happen before the stage where VRS comes into play.

2) Matt H was explaining that while VRS is nice, there are more important performance gains being enabled by good use of the GE. (XSX also has a GE - of course it does, it's AMD!) He even later stated he was absolutely not saying what features were or were not supported. He can't - he's NDA'd!

3) That's just a patent, and it's not for VRS. It seems to be more like dividing screen space up into geometrically described sections and potentially using different resolutions for the geometry overlapping in each. I think there's a long way to go to turn that into something like Tier 2 VRS, which can work on signal graphs of previous frames and independently on different buffers / stages in your rendering pipeline.

(The usefulness and final output quality from VRS is all in the implementation)

VRS has not been confirmed or even hinted at for PS5 publicly by anyone in the know so far. If it is supported in hardware, it's not been exposed yet (see @Dictator's comments).

There's always software though, so the VRS haters (not you I know!) need to lay off the juice because they'll probably end up getting some version of it in a game they like somewhere down the line! :runaway:
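
For reference, this is roughly what driving Tier 2 looks like through the public D3D12 API - just a sketch, and the shading-rate image itself (one texel per tile) would be filled by the engine each frame, e.g. from the previous frame's data:

```cpp
// Sketch of how Tier 2 VRS is driven through the public D3D12 API.
// Assumes 'cmdList' is an ID3D12GraphicsCommandList5 and 'rateImage' is an
// R8_UINT texture (one texel per screen tile) that the app fills each frame.
// (rateImage must be in the D3D12_RESOURCE_STATE_SHADING_RATE_SOURCE state.)
#include <d3d12.h>

void SetCoarseShadingForPass(ID3D12GraphicsCommandList5* cmdList,
                             ID3D12Resource* rateImage)
{
    // Base rate for the draws; the combiners decide how the per-primitive rate
    // and the screen-space image modify it. Here: ignore per-primitive rates,
    // then take the finer of (base rate, image rate) so detailed tiles stay sharp.
    const D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,  // per-primitive stage
        D3D12_SHADING_RATE_COMBINER_MIN           // screen-space image stage
    };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);
    cmdList->RSSetShadingRateImage(rateImage);

    // ... record draws for this pass ...

    // Restore full-rate shading for passes that must stay 1x1 (e.g. UI).
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
    cmdList->RSSetShadingRateImage(nullptr);
}
```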
 
There is already a use of software VRS in CoD 2020 on PS4; is it also used in the PS5 version?

Did DF notice the use of VRS on the PS4 version? I guess if used correctly it's very hard to detect?
 
"That does not make the machine suddenly terrible or something. Maybe they can add in that to the SDK later on if it actually does support it in hardware - make an API portion of it just like MS has done. If not, well then it is just a difference between the machine's GPU capabilities."

Many people will have a hard time swallowing this.
 
Things aren't clear cut but maybe they have another solution?
Well I guess you guys missed this part and jumped on my like a bunch of vultures.
Maybe they implemented some other solution which is similar, but not traditional hardware-supported VRS? :p
Maybe the one implemented on PS4 Pro can be improved.

Maybe, who cares? Maybe it just doesn't exist and they'll focus just on geometry shaders.
Truth be told, I am not very knowledgeable on the matter. I just posted those to get clarifications and opinions from those that know more.
 
Similar opinion to NXGamer, but interestingly it's a Microsoft dev, and AFAIK Microsoft didn't cut these 256-bit Zen 2 operations.

MS want commonality with the PC - that's the reason for the move to the GDK. Games that do use AVX256 need to be able to run just as fast on Xbox with minimal work. Might also increase the flexibility of cloud units - they won't always be full up with games.

Downside is that MS have to be able to deal with high thermal density of AVX256 no matter what, while staying almost silent. Tiny bit more die too.

Sony probably have a bit more leeway to cut back on this particular aspect of the CPU. It's all about dem tradeoffs, and they won't be the same for everyone.
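
For anyone wondering what "256-bit operations" actually are, here's a trivial sketch of the 8-wide float math AVX/AVX2 enables (plain PC intrinsics, nothing console-specific):

```cpp
// Trivial example of 256-bit (8-wide float) AVX operations - the SIMD width
// being discussed above. Compile with /arch:AVX2 (MSVC) or -mavx2 (gcc/clang).
#include <immintrin.h>
#include <cstdio>

int main() {
    float a[8]   = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8]   = {10, 20, 30, 40, 50, 60, 70, 80};
    float out[8];

    __m256 va = _mm256_loadu_ps(a);      // load 8 floats in one instruction
    __m256 vb = _mm256_loadu_ps(b);
    __m256 vc = _mm256_add_ps(va, vb);   // 8 additions in one instruction
    _mm256_storeu_ps(out, vc);

    for (float f : out) std::printf("%.0f ", f);
    std::printf("\n");
    return 0;
}
```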
 
I'm completely unfamiliar with the term "no microcode reference."

No API, I could still imagine the hardware is there but it's yet to be exposed. Might the absence of a microcode reference be a relatively strong indication that the hardware simply isn't there to expose?
Yea, it's a good way to look at it. It's like asking a Zen 2 motherboard to support a Zen 3 CPU; Zen 3 support would need to be present in the microcode for it to work.

I do, however, think it's interesting to know whether hardware VRS is included within the PS5, because that can then open up avenues of discussion and exploration that I think are actually worthwhile e.g. did Sony not think it worth it? Is it a well developed technique to Microsoft but much less so/not at all to Sony? Does the technique perhaps work well in conjunction with ML techniques like DLSS, and are such ML techniques well suited to the strengths of the XSX GPU? If the technique works well with AMD's DLSS equivalent, and given that there are AMD patents for chiplet ML accelerators, is Sony skipping hardware VRS in the PS5 and focussing on it with a hypothetical PS5 Pro?
The RB+ is required to support VRS on the 3D pipeline for the RDNA architecture. It may have arrived too late in their design process to incorporate it, or it was perhaps, in their eyes, ultimately too much effort to redesign to support it.
I would look at VRS, DLSS, Mesh Shaders etc. as just being hardware-based options for developers to leverage. Those particular functions can all be done on compute shaders for those teams that want to invest the time and effort to do it. If not, there is universal API support for it.

And being able to make your own versions of these features doesn't necessarily imply you can outperform the hardware versions of these functions. So it's quite an endeavor for teams to decide 'today' that they will roll their own solution vs using what is now available. Previously we've seen other teams roll their own versions of virtual texturing well before Tiled Resources was available on DX. Once it did arrive, most opted to continue using their own systems due to the inherent flexibility and control they had over their own solution. Very few companies already engaged in their VT systems leveraged Tiled Resources as a result. Quite frankly, TR was too late to the scene. VRS seems to have arrived early enough such that only a few particular engines have gone forward to create their own software version of it. UE5 is also an exception case, leveraging its own compute shaders to perform stuff the 3D pipeline is not capable of.

The best thing about compute shader solutions is the ability to back port as far back as DX11 cards.

imo, I don't think there is a mid-gen refresh coming. The node shrink would not be significant enough to warrant a refresh and still keep the price points to where they are today. Next generation after this one will be interesting however. Curious to see how they intend to tackle it.

Dirt 5's VRS solution is not ideal, perhaps overcompensating for other issues, thus they cranked up its shading rates to cover a larger area. I think if they decide to patch it (they won't) and could fix the performance issues on XSX, they could optimize the VRS to not smear out so many details in the image.
 