Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

[Attachments: four screenshots comparing the VRS modes]
Don't pay too much attention to the performance stats in the corner. After changing the settings the game took a few seconds to normalize. "Performance" mode definitely takes a hit in image quality, but with "quality" I couldn't tell the difference. "Balanced"... only if I was really looking.

Screenshots were taken at 1080p with all settings at maximum.
It should be noted that the XSS/XSX use the "quality" setting, according to The Coalition.
 
I've given up following the discussion; there's no point trying to prove things to people who only want to argue in bad faith. They already have a predetermined view and they only enjoy baiting you. Let them have their little victory.

And even after a warning from the moderators, some still carry on with exactly what they were told to quit.

VRS is a tech that can be used on top of other image technologies, and where it's applied well it does its job accordingly, IMO. It's the "DLSS" of the Xbox Series consoles: it's in its early stages now and results are mixed, but it's already improving and developers seem positive about it. We see a healthy improvement over not using it in Doom Eternal.
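For anyone unfamiliar with how it's wired up, here's a minimal sketch of what per-draw (Tier 1) VRS looks like in D3D12. The command-list interface and enums are the real D3D12 API; the surrounding setup and which draws get coarsened are assumptions for illustration.

```cpp
// Minimal Tier 1 VRS sketch (D3D12). Assumes `cmdList` is an open
// ID3D12GraphicsCommandList5 and that VRS support was verified beforehand.
#include <d3d12.h>

void DrawWithCoarseShading(ID3D12GraphicsCommandList5* cmdList)
{
    // Shade one sample per 2x2 pixel block for everything drawn after this
    // call. Passing nullptr for the combiners keeps the default behavior.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);

    // ... record draw calls that tolerate coarser shading here, e.g.
    // motion-blurred or low-contrast geometry ...

    // Restore full-rate shading for UI, text, and other detail-critical passes.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
}
```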
 
Yep, there are 79 pages in this thread and I bet 40 of them are about VRS, with the same arguments over and over again.

Instead of arguments, are there a lot of relevant results on ISO hardware out there?

All I see is either:

1 - convoluted napkin math between two different consoles where one of them is using regular VSync, so pretty useless IMO;

2 - 3dmark's VRS test that shows a whopping and unrealistic 30% performance boost with VRS 1;

3 - PC benchmarks from enthusiast websites and users showing actual games where the VRS toggle results in negligible performance boosts.



My guess is VRS is still going to be negligible for anything but low-end / low-power, ALU-bound architectures like APUs. If e.g. @Dictator wants to show off VRS, then using a 15-28W Tiger Lake with a 96-EU Xe might be the best use case. Doing so on the RTX cards he often uses for performance comparisons is probably going to be an exercise in futility, I suspect.



Don't pay too much attention to the performance stats in the corner. After changing the settings the game took a few seconds to normalize.
Could you show us your results "after normalizing", as well as your system specs?
The other user results I'm seeing on the Internet point to the same negligible performance differences I'm seeing in your screenshots.
 
Instead of arguments, are there a lot of relevant results on ISO hardware out there?
A lot? No, but the best one yet, despite being posted in this thread like a hundred times already, still hasn't been read by many:


https://devblogs.microsoft.com/directx/gears-vrs-tier2/

There you have detailed information about the overall performance gain and the gain in each rendering step.
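For context on what "Tier 2" means in that post: with Tier 2 the rate can vary across the screen via a small screen-space image, which is what the Gears implementation builds every frame. A hedged sketch of that setup follows; the feature-check struct and command-list calls are the real D3D12 API, while the creation of the rate image itself is elided and `rateImage` is assumed to exist.

```cpp
// Tier 2 VRS sketch (D3D12): a screen-space image tells the GPU which shading
// rate to use per tile. Assumes `device` and `cmdList` exist, and that
// `rateImage` is an R8_UINT texture sized (renderWidth / tileSize) x
// (renderHeight / tileSize), in D3D12_RESOURCE_STATE_SHADING_RATE_SOURCE.
#include <d3d12.h>

bool SupportsTier2(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &opts, sizeof(opts))))
        return false;
    // opts.ShadingRateImageTileSize reports the tile size (e.g. 8 or 16 px).
    return opts.VariableShadingRateTier == D3D12_VARIABLE_SHADING_RATE_TIER_2;
}

void BindRateImage(ID3D12GraphicsCommandList5* cmdList, ID3D12Resource* rateImage)
{
    // Let the screen-space image override the per-draw base rate.
    D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,  // base vs. per-primitive rate
        D3D12_SHADING_RATE_COMBINER_OVERRIDE      // the image wins over that result
    };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
    cmdList->RSSetShadingRateImage(rateImage);
}
```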
 
A lot? No, but the best one yet, despite being posted in this thread like a hundred times already, still hasn't been read by many:

https://devblogs.microsoft.com/directx/gears-vrs-tier2/

I've read that blog post several times and even quoted it more than once. It's also six months old as of today.
It's a great read, but it is neither an indicator of average performance (i.e. select scenes were chosen for demonstration purposes) nor was it performed by an independent entity like a reviewer or a user.

The games are already out there. Where are the results showing an average 10-14% performance boost for VRS Quality and Balanced on Gears Tactics, Gears 5 and Doom Eternal?
 

I am not saying you haven't read it, but I see questions popping up in this thread that are already addressed in that article. The results are in the article, and in the case of Doom Eternal we got it straight from the horse's mouth.
 
Where are the results showing an average 10-14% performance boost for VRS Quality and Balanced on Gears Tactics, Gears 5 and Doom Eternal?
Run the test yourself on a Tier 2-capable PC in Tactics or Gears 5? But honestly there is no reason to, as VRS savings are going to be self-evident, needing scarcely any testing, based upon what The Coalition already showed off (Insane settings on a 6900 XT, if I recall).
 
Run the test yourself on a Tier 2-capable PC in Tactics or Gears 5?
I would if I could :neutral:
Sell me an RTX 3080 or an RX 6800 XT at MSRP and we have a deal!

But honestly there is no reason to, as VRS savings are going to be self-evident, needing scarcely any testing, based upon what The Coalition already showed off (Insane settings on a 6900 XT, if I recall).
Has anyone been able to replicate The Coalition's 14% results for average frametimes or framerates?
 
And Hitman 3 and Metro Exodus Enhanced Edition, AFAIR, but those are VRS Tier 1. Civilization VI is Tier 2:
https://devblogs.microsoft.com/directx/variable-rate-shading-a-scalpel-in-a-world-of-sledgehammers/

The same 14% number is also mentioned there:

"For the same scene, Firaxis saw a 14% increase in FPS with their screenspace image implementation."

That's Civilization VI, which I play almost every week with friends, and at least for me there's no VRS toggle in the game's settings. Was that ever implemented in the actual game, outside of that demo from 2019?
Even just running DX12 in that game is a matter of whether or not their latest launcher has broken compatibility with the API.


I've played that game on my Ice Lake laptop with a 64-EU Gen11 iGPU. Like I said above, a 14% boost on that iGPU would be more believable IMO, and that tech would be welcome in my case.
 
This is still going on? :-|

We know VRS can produce big performance gains in some games because we know some games are pixel-shader-bound. It doesn't take a million super-precise tests.
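To put rough numbers on that: a back-of-the-envelope model (plain Amdahl's law, with made-up illustrative inputs, not measurements) shows how a shading-only speedup translates to whole-frame gains. If a frame is heavily pixel-shader-bound, a modest shading speedup shows up almost directly; if only ~30% of the frame is shading-bound, even shading 30% faster nets around 7% overall, which is why "big gains" and "negligible gains" can both be true depending on the workload.

```cpp
// Back-of-the-envelope: whole-frame speedup from a shading-only speedup.
// Plain Amdahl's law; both inputs are illustrative assumptions.
#include <cstdio>

double FrameSpeedup(double shadedFraction, double shadingSpeedup)
{
    // New frame time = untouched work + shading work divided by its speedup.
    return 1.0 / ((1.0 - shadedFraction) + shadedFraction / shadingSpeedup);
}

int main()
{
    // A heavily pixel-shader-bound frame vs. a geometry/bandwidth-bound one.
    std::printf("80%% shading-bound, 1.5x shading: %.1f%% faster overall\n",
                (FrameSpeedup(0.8, 1.5) - 1.0) * 100.0);  // ~36% faster
    std::printf("30%% shading-bound, 1.3x shading: %.1f%% faster overall\n",
                (FrameSpeedup(0.3, 1.3) - 1.0) * 100.0);  // ~7% faster
    return 0;
}
```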
 
A million? We don't need a million precise tests.

But can we get, say, 4 (four) examples of "big performance gains" that aren't self-reports from developers on selected scenes?
I mean actual "VRS Off vs. VRS On" tests with average framerate and/or frametime results from test runs.

It's not that hard to achieve, and I'd do it myself had I been able to upgrade my GPU this year.
Neither my laptop's GTX 1050 Ti nor my desktop's Vega 64 is able to do it. My work subnotebook does run VRS Tier 1 on an Intel Gen11 iGPU, but the reported 14% gains aren't coming from Tier 1 VRS, IIRC.
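For anyone with a VRS-capable card who does want to generate those numbers, here's a minimal sketch of the measurement side: average the frametimes out of a capture for a VRS-off run and a VRS-on run of the same benchmark pass. It assumes a PresentMon-style CSV with a MsBetweenPresents column; the exact header layout of your capture tool may differ.

```cpp
// Average frametime from a PresentMon-style CSV (assumed column:
// "MsBetweenPresents"). Build: g++ -std=c++17 avg_frametime.cpp -o avg_frametime
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

int main(int argc, char** argv)
{
    if (argc < 2) { std::cerr << "usage: avg_frametime <capture.csv>\n"; return 1; }
    std::ifstream file(argv[1]);
    std::string header;
    std::getline(file, header);

    // Locate the frametime column by name in the CSV header.
    std::vector<std::string> cols;
    std::stringstream hs(header);
    for (std::string c; std::getline(hs, c, ',');) cols.push_back(c);
    size_t idx = 0;
    while (idx < cols.size() && cols[idx] != "MsBetweenPresents") ++idx;
    if (idx == cols.size()) { std::cerr << "column not found\n"; return 1; }

    // Sum that column across all captured frames.
    double sum = 0.0; size_t n = 0;
    for (std::string line; std::getline(file, line);) {
        std::stringstream ls(line);
        std::string cell;
        for (size_t i = 0; std::getline(ls, cell, ','); ++i)
            if (i == idx) { sum += std::stod(cell); ++n; break; }
    }
    if (n == 0) { std::cerr << "no samples\n"; return 1; }

    const double avg = sum / n;
    std::cout << "avg frametime: " << avg << " ms (" << 1000.0 / avg
              << " fps) over " << n << " frames\n";
    return 0;
}
```

Run it once on the VRS-off capture and once on the VRS-on capture of the same pass; the ratio of the two averages is the number this thread keeps arguing about.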



Stop making and presenting false statements. You're the only person ever talking about VRS as a silver bullet or miraculous tech. Everyone except for you has a reasonable view of it.
Is he though? Have you seen the post above declaring "vrs can produce big performance gains"?
It's definitely not the only post I've seen with such claims in this thread.
 
So if VRS Tier 2 isn't all that, then the resolution gap between the Series X and PS5 in Doom Eternal is mostly down to power differences.

But we have word straight from the lead engine programmer that VRS Tier 2 is responsible for a 10-15% resolution boost.
 

This is totally unfair, but sometimes I get the feeling that a developer stating that they get an X% performance gain from VRS isn't considered valid since it isn't coming from a Sony developer. :p

Instead, "proof" is going through a level and looking for the location where it provides the least possible benefit. Let's just ignore that the performance uplift won't be uniform across a game's various levels, since it depends on how taxing a particular section is combined with how much VRS is used in it. I.e., it'll be more or less depending on the scene.

If VRS is used instead of more aggressive DRS, then you should be looking at the most taxing section of a level to see what benefits VRS might provide. And even then it may not be telling the whole tale.

Time and time again we see that it's difficult to isolate one particular rendering effect, tool, or optimization in benchmarks, because any given scene uses many, many different rendering effects, tools, and optimizations. But still some persist in trying to say that X performance gain or loss in any given scene is down to just one thing. :p

Regards,
SB
 
Is he though? Have you seen the post above declaring "vrs can produce big performance gains"?
It's definitely not the only post I've seen with such claims in this thread.

How on earth is that an unreasonable view? You can turn your whole screen to 2x2 VRS and any part of your pipeline that scales with resolution will run about 4x faster. It would look bad, of course, because it'd be shading at 1/4th the res! The idea that rendering less can make your game perform faster is the kind of common sense that should absolutely not be up for debate.

What's worth discussing is how practical that hypothetical is for modern engines, how severe the visual compromises have to be, etc., which I've talked about in other posts.

Not even gonna touch the 'graphics programmers are lying, only enthusiasts know how to measure performance' part of your post.
 