Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

We're still a very long way from seeing VRS implemented properly, or at least performing where it's supposed to. It may be too early to judge its performance in its entirety, of course.
But obviously native vs VRS, native is better.
Higher resolution with VRS vs lower resolution native, you're going to find it comes out in favour of VRS, at least from my analysis, but it's not necessarily a clear win.
But how much performance you gain from VRS depends on the situation. Unfortunately it won't be consistent: looking at a dark area is going to net massive gains, for instance, while bright, detailed areas perhaps won't.

So it would be best to indicate a performance range for VRS.
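
To illustrate why the gains swing so much with content: a Tier 2 style implementation typically classifies screen tiles using something like the previous frame's luminance and contrast, and dark, flat tiles get the coarsest shading rates. A rough CPU-side sketch of that kind of classification is below; the 8x8 tile size, the thresholds and the rate encoding are purely illustrative assumptions, not values from any shipping engine.
Code:
// Minimal CPU-side sketch of content-adaptive shading-rate selection.
// Assumptions (not from this thread): 8x8 pixel tiles, made-up luma/contrast
// thresholds, and width/height that are multiples of the tile size.
// Real engines do this per-tile on the GPU.
#include <algorithm>
#include <cstdint>
#include <vector>

// Mirrors the D3D12_SHADING_RATE encoding: log2(x-rate) in bits 2-3, log2(y-rate) in bits 0-1.
enum ShadingRate : uint8_t { Rate1x1 = 0x0, Rate2x2 = 0x5, Rate4x4 = 0xA };

// Coarsen where the tile is dark and low-contrast, i.e. where the eye (and TAA)
// will hide the reduced shading resolution.
ShadingRate PickRate(float avgLuma, float contrast) {
    if (avgLuma < 0.05f && contrast < 0.02f) return Rate4x4;  // near-black, flat
    if (avgLuma < 0.20f && contrast < 0.05f) return Rate2x2;  // dim, low detail
    return Rate1x1;                                           // bright or detailed
}

// Builds a (width/8 x height/8) shading-rate image from last frame's luma buffer.
std::vector<ShadingRate> BuildRateImage(const std::vector<float>& luma,
                                        int width, int height) {
    const int tile = 8;  // assumed tile size; hardware reports the real one
    const int tw = width / tile, th = height / tile;
    std::vector<ShadingRate> rates(tw * th);
    for (int ty = 0; ty < th; ++ty) {
        for (int tx = 0; tx < tw; ++tx) {
            float sum = 0.f, minL = 1.f, maxL = 0.f;
            for (int y = 0; y < tile; ++y) {
                for (int x = 0; x < tile; ++x) {
                    float l = luma[(ty * tile + y) * width + (tx * tile + x)];
                    sum += l;
                    minL = std::min(minL, l);
                    maxL = std::max(maxL, l);
                }
            }
            rates[ty * tw + tx] = PickRate(sum / (tile * tile), maxL - minL);
        }
    }
    return rates;
}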

Teaser: Green square XSX higher IQ. Blue Square PS5 higher IQ.
XSX holds resolution advantage here.

My full analysis is on the subscriber forum. I've still got a lot of formatting and sourcing to put in, so I've held back from releasing it to the public until I get more feedback. But it's there now if you want to provide some.
[Image: fft-cmp-ps5-xsx-rt-s10.png]


I also think I found the gif that Global cropped. It comes out as blue tiles, which is a relief, because all of this is done through spectrum analysis and at least that's a solid indication it's working. Those images are at the same internal rendering resolution though, so PS5 comes out on top in that particular comparison.

Hopefully this style of analysis removes cherry picking.
 
This is totally unfair, but sometimes I get the feeling that a Developer stating that they get X% performance gain from VRS isn't considered valid since it isn't coming from a Sony developer. :p

Instead, the accepted proof is going through a level and looking for a location where it provides the least benefit possible. Let's just ignore that the performance uplift won't be uniform across a game's various levels, since it depends on how taxing a particular section is combined with how much VRS is used in that section.

If VRS is used instead of more aggressive DRS, then you should be looking at the most taxing section of a level to see what benefits VRS might provide. And even then it may not be telling the whole tale.

Time and time again we see that it's difficult to isolate [1] particular rendering effect, tool, optimization, etc. in benchmarks, because any given scene uses many, many different rendering effects, tools, optimizations, etc. But still some persist in claiming that X performance gain or loss in any given scene is down to just [1] thing. :p

Regards,
SB

Of course word from the team itself needs to be taken seriously, but isn't it a natural task of the reviewer or tester to check the claims out?
GPU reviews or platform comparisons typically try damn hard to have 100% identical scenarios and then pick apart the differences they spot, with the reviewer giving their opinion on whether compromise X was worth it or not. You see them provide a variety of scenes for their comparisons.
Yet the actual comparison images for VRS vs native being linked here earlier are being accepted without even comparing the same scene.
 
Of course word from the team itself needs to be taken seriously, but isn't it a natural task of the reviewer or tester to check the claims out?
GPU reviews or platform comparisons typically try damn hard to have 100% identical scenarios and then pick apart the differences they spot, with the reviewer giving their opinion on whether compromise X was worth it or not. You see them provide a variety of scenes for their comparisons.

Sure, but since I started benchmarking and reading reviews (going on over 30 years now), time and again it's been shown that benchmarking performance varies greatly by level, area, and scene. Even within a given scene performance can vary greatly depending on where the camera is pointed.

Unscrupulous reviewers can and have cherry-picked scenes to benchmark to drive a narrative they favor. And even for reviewers who attempt to be as impartial as possible, it's quite literally not feasible to benchmark the entirety of a game, so there's still a chance, albeit low, that what they choose to benchmark isn't representative of the game as a whole. And even when it's "generally" representative, there will still be portions of the game where it isn't.

So, considering how complex rendering a single frame is, when combined with the complexity of finding a suitable scene to benchmark, how do you isolate the performance impact of [1] rendering effect from the multitude that are in use at any given moment?

Even given hardware that supports all rendering effects used in a given scene, performance will vary based on the design of the hardware itself. On PC you can attempt to isolate this by enabling and disabling a given effect. However, even then it isn't uncommon that enabling or disabling a rendering effect alters more than just that one effect. And even then, was that scene representative of the minimum, maximum, or average performance impact of that effect? We could know definitively if we benched the entire game with it on and then again with it off, but that's prohibitively time consuming. Instead, a reviewer/benchmarker chooses a scene (or a tiny sampling of scenes) where they "think" X rendering effect is used a lot, but this is prone to error.
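
To make that concrete, a rough sketch of the "bench the whole run with it on, then off, and diff" approach: capture per-frame times for the same scripted run twice and look at the distribution of the delta rather than a single average, which shows how unrepresentative any one scene can be. The file names and the one-number-per-line format are assumptions for illustration only.
Code:
// Sketch: compare two frame-time captures of the same scripted benchmark run
// (one with an effect enabled, one without) and report how the saving is
// distributed. Assumes the runs are frame-aligned, which a scripted benchmark
// roughly guarantees; the file names/format are hypothetical.
#include <algorithm>
#include <cstdio>
#include <fstream>
#include <vector>

static std::vector<double> LoadFrameTimesMs(const char* path) {
    std::ifstream in(path);
    std::vector<double> ms;
    for (double v; in >> v;) ms.push_back(v);
    return ms;
}

int main() {
    auto off = LoadFrameTimesMs("run_effect_off.csv");  // hypothetical capture
    auto on  = LoadFrameTimesMs("run_effect_on.csv");   // hypothetical capture
    const size_t n = std::min(off.size(), on.size());
    if (n == 0) { std::puts("no data"); return 1; }

    std::vector<double> savedMs(n);
    for (size_t i = 0; i < n; ++i) savedMs[i] = off[i] - on[i];
    std::sort(savedMs.begin(), savedMs.end());

    // The spread is the interesting part: a dark corridor and a bright vista
    // will sit at opposite ends of this distribution.
    auto pct = [&](double p) { return savedMs[static_cast<size_t>(p * (n - 1))]; };
    std::printf("frame time saved (ms): p5=%.2f  median=%.2f  p95=%.2f\n",
                pct(0.05), pct(0.50), pct(0.95));
    return 0;
}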

A good developer OTOH has profiling tools with which to look at any given scene and has been doing so for months and years, so they're quite aware of where X rendering effect will or won't provide a lot of benefit. On top of that they'll be far more intimately familiar with the performance impact of a given rendering effect across the scope of the game.

This gets more complex if you then try to determine the performance impact of this on 2 different consoles. Is the difference due to VRS on/off? How much of the difference is due to VRS on or off? How much of the difference is due to the hardware differences?

Regards,
SB
 
A million? We don't need a million precise tests.

But can we get say 4 (four) examples of "big performance gains" that aren't self-reports from developers on selected scenes?
I mean actual "VRS Off vs. VRS On" tests with average framerate and/or frametime results from test runs.

It's not that hard to achieve and I'd do it myself had I been able to upgrade my GPU this year.
Neither my laptop's GTX 1050 Ti nor my desktop's Vega 64 will be able to do it. My work subnotebook does run VRS Tier 1 on an Intel Gen11 iGPU, but the reported 14% gains aren't coming from Tier 1 VRS IIRC.
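
For reference, the VRS tier a PC GPU and driver actually expose can be checked through D3D12's CheckFeatureSupport. A minimal sketch, assuming you already have an ID3D12Device and with error handling trimmed:
Code:
// Query which VRS tier a D3D12 device exposes (Windows only).
#include <d3d12.h>
#include <cstdio>

void ReportVrsSupport(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &opts6, sizeof(opts6)))) {
        std::printf("OPTIONS6 not reported (older runtime/driver)\n");
        return;
    }
    switch (opts6.VariableShadingRateTier) {
    case D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED:
        std::printf("VRS: not supported\n");
        break;
    case D3D12_VARIABLE_SHADING_RATE_TIER_1:
        std::printf("VRS: Tier 1 (per-draw shading rate only)\n");
        break;
    case D3D12_VARIABLE_SHADING_RATE_TIER_2:
        std::printf("VRS: Tier 2 (per-draw, per-primitive and screen-space image, "
                    "tile size %u)\n", opts6.ShadingRateImageTileSize);
        break;
    }
}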




Is he though? Have you seen the post above declaring "vrs can produce big performance gains"?
It's definitely not the only post I've seen with such claims in this thread.

VRS, the new SoftRAM.

https://en.wikipedia.org/wiki/SoftRAM
 
This is totally unfair, but sometimes I get the feeling that a Developer stating that they get X% performance gain from VRS isn't considered valid since it isn't coming from a Sony developer.

1 - The Developer you mention claims up to X% performance gain, followed by an example. Nowhere did the Coalition claim a 14% gain on average.

2 - Had it come from a Sony developer the obvious approach should still be "trust but verify". Apparently in what relates to performance gains everyone's trusting and no one's verifying.
 
VGTech reports an XSX resolution advantage in all modes as well. What's interesting is that at the times XSX is capped at its max resolution, it would presumably go even higher if it weren't capped in those modes.

Interesting that the smallest difference is in the RT mode (14.8% more pixels on XSX using VRS).
 
Could you show us your results "after normalizing", as well as your system specs?
The other user results I'm seeing on the Internet point to the same negligible performance differences I'm seeing in your screenshots.
I know people are going to want exact numbers, but, sorry, I have a lot on my plate today. But I do remember that the OFF was basically 100 fps (would flutter between 99 and 100) and VRS performance was 108-110. The other modes basically scaled linearly. 103ish for quality, 106ish for balanced. Honestly, I think to get real performance numbers I'd have to play the game or run the benchmark, and I just don't have time for that. Standing in a hallway and changing settings only tells you performance for that one scene, and it's possible I'm limited by something other than shading performance.

Hardware wise I have a Ryzen 5 2600X, 32GB RAM, and an RTX 2070 Super (8GB of VRAM). Nothing overclocked, except I guess the GPU, which comes with a factory overclock (1785MHz vs 1650 stock, I think).
 
Gears 5 VRS benchmark results (average FPS). Every setting at maximum; the only variables changed were render scale (for resolution) and VRS.
Code:
1080p
Off          67.1
Quality      69.6
Balanced     69.4
Performance  70.7

1440p
Off           48.5
Quality       51.3
Balanced      52.0
Performance   54.3

2160p
Off           25.7
Quality       28.3
Balanced      28.6
Performance   30.05
Random note: at 4K scaled back down to 1080p (I only have a 1080p monitor), Performance mode doesn't look bad. Which makes sense, since you are blending screen data while scaling down, but you still get the better edge quality from the higher-resolution framebuffer.

Also, going from OFF to QUALITY just seems to add 2-3 FPS regardless of resolution, so it's not a flat percentage.

Going from QUALITY to BALANCED gives you very little extra performance, though I did notice GPU utilization was slightly lower at 1080p. However, so was the framerate. Performance mode... well, if there existed a GPU that had tons of fill rate but less compute, that would be awesome, I think. Otherwise, the quality drop-off is too noticeable.
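
For anyone curious what those presets toggle at the API level: with generic D3D12 Tier 2 VRS the renderer binds a screen-space shading-rate image plus a pair of combiners, and a more aggressive preset essentially writes coarser rates into that image. A sketch of that generic path is below; it assumes an ID3D12GraphicsCommandList5 and an R8_UINT rate image already filled and transitioned to the shading-rate-source state, and it is not taken from Gears 5's actual code.
Code:
// Generic D3D12 Tier 2 VRS binding; not Gears 5's implementation.
#include <d3d12.h>

void BindVrsForOpaquePass(ID3D12GraphicsCommandList5* cmdList,
                          ID3D12Resource* rateImage) {
    // Per-draw base rate stays 1x1; the screen-space image is allowed to
    // coarsen it via the MAX combiner. A "Performance"-style preset would
    // simply write coarser rates into the image.
    const D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,  // per-draw vs per-primitive
        D3D12_SHADING_RATE_COMBINER_MAX           // then vs the rate image
    };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
    cmdList->RSSetShadingRateImage(rateImage);
}

void DisableVrs(ID3D12GraphicsCommandList5* cmdList) {
    // Equivalent to the benchmark's "Off": full-rate shading everywhere.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
    cmdList->RSSetShadingRateImage(nullptr);
}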
 
So from your results, using Balanced:

1080p: 3.4% boost
1440p: 7.2% boost
2160p: 11.3% boost

Using Quality:
1080p: 3.7% boost
1440p: 5.8% boost
2160p: 10.1% boost


As expected, the gain gets larger the higher the render resolution it's used at. For 1440p and below the difference is really small, though.
I can see why this could make a substantial difference in the average resolution of a DRS renderer. Though I think the games targeting native 4K are scarce and about to get scarcer still, due to DLSS, FSR, TAAU, etc.
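
For completeness, here's how those percentages fall out of the posted averages, with the same gain expressed as milliseconds of frame time, which is usually the more useful number for a DRS budget. The only inputs are the figures quoted above.
Code:
// Recomputes the VRS gains from the posted Gears 5 averages (FPS), and also
// expresses the Balanced gain as frame time saved per frame in milliseconds.
#include <cstdio>

int main() {
    struct Row { const char* res; double off, quality, balanced; };
    const Row rows[] = {
        {"1080p", 67.1, 69.6, 69.4},
        {"1440p", 48.5, 51.3, 52.0},
        {"2160p", 25.7, 28.3, 28.6},
    };
    for (const Row& r : rows) {
        double gainQ = (r.quality  / r.off - 1.0) * 100.0;  // % faster, Quality
        double gainB = (r.balanced / r.off - 1.0) * 100.0;  // % faster, Balanced
        double msSavedB = 1000.0 / r.off - 1000.0 / r.balanced;  // frame-time delta
        std::printf("%s: Quality +%.1f%%, Balanced +%.1f%% (%.2f ms/frame saved)\n",
                    r.res, gainQ, gainB, msSavedB);
    }
    return 0;
}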


Going from QUALITY to BALANCED gives you very little extra performance, though I did notice GPU utilization was slightly lower at 1080p. However, so was the framerate. Performance mode... well, if there existed a GPU that had tons of fill rate but less compute, that would be awesome, I think. Otherwise, the quality drop-off is too noticeable.
Navi 23 with 28/32CUs and a whopping 64 ROPs comes to mind.
It would be nice if we could find VRS Off vs. VRS Performance comparisons on one of those GPUs.
 
Gears 5 VRS benchmark results (average FPS). Every setting at maximum; the only variables changed were render scale (for resolution) and VRS.
Code:
1080p
Off          67.1
Quality      69.6
Balanced     69.4
Performance  70.7

1440p
Off           48.5
Quality       51.3
Balanced      52.0
Performance   54.3

2160p
Off           25.7
Quality       28.3
Balanced      28.6
Performance   30.05
Random note: at 4K scaled back down to 1080p (I only have a 1080p monitor), Performance mode doesn't look bad. Which makes sense, since you are blending screen data while scaling down, but you still get the better edge quality from the higher-resolution framebuffer.

Also, going from OFF to QUALITY just seems to add 2-3 FPS regardless of resolution, so it's not a flat percentage.

Going from QUALITY to BALANCED gives you very little extra performance, though I did notice GPU utilization was slightly lower at 1080p. However, so was the framerate. Performance mode... well, if there existed a GPU that had tons of fill rate but less compute, that would be awesome, I think. Otherwise, the quality drop-off is too noticeable.

Not like you'd want to run those benches again, but it would have been interesting to see the average minimum framerate as well as the CPU/GPU split for those cases. I'd imagine it would be pretty similar, but there might be some interesting things there. The savings in rendering time from VRS could be getting used in other parts of the rendering pipeline. Unfortunately, while the Gears 5 benchmark gives us significantly more information than other built-in benchmarks, the information isn't fine-grained enough to determine that.

As well, it should be noted that some amount of rendering time will be used for scaling the output back down to 1080p from 4K. Overall, however, it looks to be about what you would expect.

Thanks for going through all that effort. :)

Regards,
SB
 
I noticed in DOOM they keep reflections in the RT path, but it looks like it's SSR only (with some weird reprojection)

[Image: doom_rtd9jbd.png]

[Image: doom_rt2yhjoc.png]


Typical SSR artifacts in the second screenshot. Excuse the crushed blacks as this is a screencap from an HDR screenshot mapped back to SDR lol

Maybe it's just SSR and I'm confused :LOL:
 
Have to also consider that Nvidia VRS Tier 2, or even AMD VRS Tier 2, might not be as performant as the Series consoles' VRS Tier 2 due to hardware implementation differences or low-level access differences. Didn't MS say their Series VRS goes beyond Tier 2?
 