VRS: Variable Rate Shading *spawn*

A 33% speed-up is very impressive, and much better than what we have seen so far from VRS implementations.

What does he mean by bigger wins on Xbox compared to PC, though? I assume by some PC parts he means Intel GPUs that only support VRS Tier 1. DX12U GPUs like RDNA2, Ampere and Turing should benefit the same way.
 
A 33% speed-up is very impressive, and much better than what we have seen so far from VRS implementations.

What does he mean by bigger wins on Xbox compared to PC, though? I assume by some PC parts he means Intel GPUs that only support VRS Tier 1. DX12U GPUs like RDNA2, Ampere and Turing should benefit the same way.
I was under the impression this could be a result of ExecuteIndirect having some additional options on console DX12 that aren't exposed on PC. From what I understand, you are generally more likely to leverage ExecuteIndirect on console than on PC.
 
A 33% speed-up is very impressive, and much better than what we have seen so far from VRS implementations.

What does he mean by bigger wins on Xbox compared to PC, though? I assume by some PC parts he means Intel GPUs that only support VRS Tier 1. DX12U GPUs like RDNA2, Ampere and Turing should benefit the same way.
The amount of speed-up will depend on how GPU-bound the test is. For example, when they did Gears, they found a PC at 4K on a 6800 XT got less help from VRS than the XSX did. But when they went above 4K or massively increased the load on the 6800 XT (with higher settings), they saw greater gains.
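That matches a simple Amdahl-style back-of-envelope model (my own sketch, not from the video, with made-up numbers): VRS only shrinks the pixel-shading slice of the frame, so the more pixel-bound the GPU is, the bigger the overall win.

```python
def vrs_speedup(pixel_shading_fraction: float, rate_reduction: float) -> float:
    """Overall frame speedup when VRS trims pixel-shading cost.

    pixel_shading_fraction: share of frame time spent pixel shading (0..1).
    rate_reduction: remaining pixel-shading cost under VRS; e.g. 0.6 means
    coarser shading rates skipped 40% of the shading work.
    """
    new_frame = (1.0 - pixel_shading_fraction) + pixel_shading_fraction * rate_reduction
    return 1.0 / new_frame

# Hypothetical numbers: less pixel-bound at 1440p, heavily pixel-bound above 4K.
print(vrs_speedup(0.50, 0.6))  # smaller overall win when less pixel-bound
print(vrs_speedup(0.80, 0.6))  # bigger overall win under heavier shading load
```

The same 40% shading saving buys more total frame time the more the frame is dominated by pixel shading, which is why pushing past 4K widened the gap.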
 
VRS has a lot of performance potential. It's a pity that the PS5 GPU doesn't support it.

You might be surprised, but quite a few modern high-end deferred renderers are competitive with renderers that use a hardware implementation of VRS. Most reasonably modern deferred renderers use compute shaders to calculate their lighting pass, and you can't really apply the complete VRS hardware pipeline in those instances, since the hardware implementation of VRS is part of the pixel-shading pipeline state. "Sparse lighting", as mentioned in one of the recent videos in this thread, is mostly a software technique that uses coverage mask information generated from the hardware pipeline ...
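The software route can be sketched very roughly. This is purely illustrative Python of my own (hypothetical tile size and variance threshold), standing in for what would really be a compute-shader pass over the G-buffer: pick a coarse shading rate for low-contrast tiles and full rate elsewhere.

```python
def pick_shading_rates(lum, tile=2, threshold=0.01):
    """For each tile x tile block of a luminance buffer, pick a shading
    rate: 1 = shade every pixel, 2 = shade once per block (quarter rate).
    Low-variance (flat) tiles get the coarse rate, as in software VRS /
    sparse-lighting style passes."""
    h, w = len(lum), len(lum[0])
    rates = []
    for ty in range(0, h, tile):
        row = []
        for tx in range(0, w, tile):
            vals = [lum[y][x] for y in range(ty, ty + tile)
                               for x in range(tx, tx + tile)]
            mean = sum(vals) / len(vals)
            var = sum((v - mean) ** 2 for v in vals) / len(vals)
            row.append(1 if var > threshold else 2)
        rates.append(row)
    return rates

# Flat left half, high-contrast right half of a tiny 4x4 luminance buffer:
lum = [
    [0.5, 0.5, 0.1, 0.9],
    [0.5, 0.5, 0.9, 0.1],
    [0.5, 0.5, 0.1, 0.9],
    [0.5, 0.5, 0.9, 0.1],
]
print(pick_shading_rates(lum))  # coarse (2) on the flat tiles, full (1) on the detailed ones
```

The point is that none of this needs the pixel-shader pipeline state, which is why a compute-based lighting pass can get similar savings without hardware VRS.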

A 33% speed-up is very impressive, and much better than what we have seen so far from VRS implementations.

What does he mean by bigger wins on Xbox compared to PC, though? I assume by some PC parts he means Intel GPUs that only support VRS Tier 1. DX12U GPUs like RDNA2, Ampere and Turing should benefit the same way.

On Xbox you can read compressed depth surfaces, but I think that applies to any modern AMD HW ...
 
Are we really 100% sure the PS5 can't use hardware-based VRS?

I find the evidence supporting that theory lacking. Sony could have done some tweaking to the GFX chip that we don't know about.
 
It doesn't have the RB+s for it.

How do you know? I just skimmed through the Road To PS5 video, and there's seemingly no information on that.

Remember, only official sources count. A GPU ID in Linux does not count, as Sony could have done some tweaks we don't know about and not bothered to change the name. Twitter also does not count, as messages there could be faked.

Official sources like AMD say it's using RDNA2, so then HW-VRS is possible.
 
Both the XSX and PS5 GPUs are custom RDNA 2 chips. The HW needed for VRS Tier 2 sits in the new ROPs, which the XSX has. The PS5 still uses RDNA 1 ROPs, which lack this feature, AFAIK.
 


Sony PS5 Console: Architecture and Technical Specifications in Detail | ITIGIC

"To apply this, the RDNA 2 for PC and Xbox have had to renew the Render Backend units, which are what we call ROPS in a GPU. Where AMD traditionally separates them into Z-ROPS and Color ROPS. Well, on PS5 they are the same as RDNA and not those of RDNA 2. The same can be said of the rasterized unit, which is also RDNA. The consequences? The lack of support for Variable Rate Shading by hardware, although it is possible to do it by software."
 
Those aren't credible sources.

For example

Well, on PS5 they are the same as RDNA and not those of RDNA 2. The same can be said of the rasterized unit, which is also RDNA.

Where is the evidence supporting that statement?

The Twitter guy is also not a source. "They most likely use the older RB design.." suggests he is not even sure of it himself.
 
Those aren't credible sources.

For example



Where is the evidence supporting that statement?

The Twitter guy is also not a source. "They most likely use the older RB design.." suggests he is not even sure of it himself.
Locuza and others worked together to X-ray the silicon directly and could count the number of RB units physically present. If the PS5 had RB+ units, it would effectively have 128 ROPs, which would be a waste of silicon.
These are the older RB units from RDNA 1.

This topic has been discussed at great length in this forum, with more sources and information laid out; you can follow the arguments for and against it. There are several, and you are welcome to use the search function above.
 
A 33% speed-up is very impressive, and much better than what we have seen so far from VRS implementations.

What does he mean by bigger wins on Xbox compared to PC, though? I assume by some PC parts he means Intel GPUs that only support VRS Tier 1. DX12U GPUs like RDNA2, Ampere and Turing should benefit the same way.
Is Variable Rate Shading not good enough? In the current situation, VRS can lead to Game Pixel error problem; worse, the performance savings of VRS are not controllable (they fluctuate too much).
 
can lead to Game Pixel error problem

Explain what you mean by this, "game pixel error problem".

Your next statement is also semantically wrong, as VRS is entirely under the direct control of the developers. They could design their implementation so it always applies at a minimum set amount if that were absolutely desired. It generally isn't.

The best systems are the genuinely dynamic and multifaceted ones that use both VRS and DRS. As documented through the blogs and developer interviews linked earlier, in most situations their VRS implementation works well enough to prevent DRS from needing to kick in. This provides better image quality than having to lower the resolution.
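As a toy illustration of that ordering (entirely hypothetical numbers and thresholds, not any shipping engine's controller), the idea is to coarsen the shading rate first and only touch the render resolution once VRS is maxed out, recovering in the reverse order:

```python
def adjust_quality(frame_ms, budget_ms, vrs_level, res_scale,
                   max_vrs_level=2, min_res_scale=0.7):
    """Toy VRS+DRS controller: when over budget, step up the VRS
    aggressiveness first (cheaper in perceived image quality), and only
    lower the render resolution once VRS is already maxed out. When
    comfortably under budget, restore resolution first, then relax VRS."""
    if frame_ms > budget_ms:
        if vrs_level < max_vrs_level:
            vrs_level += 1
        elif res_scale > min_res_scale:
            res_scale = max(min_res_scale, res_scale - 0.1)
    elif frame_ms < 0.9 * budget_ms:
        if res_scale < 1.0:
            res_scale = min(1.0, res_scale + 0.1)
        elif vrs_level > 0:
            vrs_level -= 1
    return vrs_level, res_scale

state = (0, 1.0)  # (vrs_level, resolution scale)
for ms in (18.0, 18.0, 18.0, 14.0):  # three frames over a 16.6 ms budget, then recovered
    state = adjust_quality(ms, 16.6, *state)
print(state)
```

Real engines drive this from GPU timing feedback per pass rather than whole-frame times, but the priority order is the point: VRS absorbs load spikes so DRS rarely has to.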
 
Is Variable Rate Shading not good enough? In the current situation, VRS can lead to Game Pixel error problem; worse, the performance savings of VRS are not controllable (they fluctuate too much).
VRS can be applied to individual buffers should developers choose to; it's much more granular if they want it to be. When it was being marketed as a blanket saver, VRS was applied to the end product, and you got a blurry mess. But when applied appropriately to the layers that should have it during the rendering of the final output, you're going to get a much better result. I think we are a long way from seeing what VRS is capable of, much like it took until the end of last generation to see what the consoles were capable of. There are a great many features this generation that have yet to be explored.
 
My only worry about VRS is the lack of (h/w) support for it on the PS5. If that's the case, then there won't be as much incentive for developers to try to find great uses for it, and we will continue getting "cheap" implementations like the ones we've seen so far.
 