Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

I've tried the mod and can confirm a decent performance uplift on my 3060 Ti.

Game running at 1440p with DLSS on Quality, all RT settings on, and everything else on Ultra+ (GPU load is 100% in all scenes)

In the first little village you go into
  • No mod: 38fps
  • With mod: 44fps

Overlooking a field by the windmill
  • No mod: 43fps
  • With mod: 53fps

Autosave with a guy fixing a wagon at the start of the game (overlooking water and trees)
  • No mod: 35fps
  • With mod: 43fps

I've not tried dropping DLSS to Balanced mode yet, but it should get me quite close to a locked 60fps across the whole game now.
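As a very rough sanity check on that Quality-to-Balanced expectation, here's a sketch using the standard DLSS 2.x per-axis scale factors (the exact factors a given game uses can differ, and the fps projection is a naive ceiling, not a prediction):

```python
# Rough sanity check on the Quality -> Balanced expectation above.
# Per-axis scale factors are the standard DLSS 2.x ratios; the fps
# projection assumes cost scales with internal pixel count, which it
# won't exactly -- RT and fixed per-frame costs intervene.
SCALE = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

def internal_res(w, h, mode):
    s = SCALE[mode]
    return round(w * s), round(h * s)

q = internal_res(2560, 1440, "Quality")      # ~1708 x 960
b = internal_res(2560, 1440, "Balanced")     # ~1485 x 835
pixel_ratio = (b[0] * b[1]) / (q[0] * q[1])  # ~0.76

print(f"Quality {q}, Balanced {b}, ~{1 - pixel_ratio:.0%} fewer pixels")
print(f"naive ceiling from the 43fps scene: ~{43 / pixel_ratio:.0f}fps")
```

Balanced renders roughly a quarter fewer pixels than Quality at 1440p, so the worst scene above would top out somewhere under ~57fps even in the best case, which is indeed "quite close" to a locked 60.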
 

Interesting pixel fillrate measurement: Xbox Series X is just behind PS5.
It also shows how bandwidth-limited the Pro was: 3 vs 13.1, a gap (4.4x) much bigger than the theoretical gap (2.25x). The PS5 clearly doesn't seem to be much limited by its bandwidth here. Maybe it's helped by the extended delta color compression of RDNA.
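Spelling that inference out, a minimal sketch using only the numbers quoted in this post (the 2.25x theoretical gap is taken at face value):

```python
# If the measured gap between two GPUs is much larger than the
# theoretical gap from their specs, the slower one is losing ground to
# something besides raw fillrate -- here, presumably bandwidth.
measured_pro, measured_ps5 = 3.0, 13.1  # benchmark results from the chart
theoretical_gap = 2.25                  # expected ratio from specs

measured_gap = measured_ps5 / measured_pro  # ~4.37x
excess = measured_gap / theoretical_gap     # ~1.94x

print(f"measured gap {measured_gap:.2f}x vs theoretical {theoretical_gap}x")
print(f"Pro reaches only ~{1 / excess:.0%} of what specs alone predict "
      f"relative to PS5, consistent with a bandwidth bottleneck")
```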
I think that was known; the PS5 has a slightly higher fillrate due to clocks.
Sure but we didn't have any benchmarks to actually show it. Or did we?
 
It also shows how bandwidth-limited the Pro was: 3 vs 13.1, a gap (4.4x) much bigger than the theoretical gap (2.25x). The PS5 clearly doesn't seem to be much limited by its bandwidth here. Maybe it's helped by the extended delta color compression of RDNA.

Sure but we didn't have any benchmarks to actually show it. Or did we?
No, this is the first benchmark.
 
When you do that, the game will just use the system DLL, which works with more overlay tools than the included one, by the way, but performance is the same here.


Interesting pixel fillrate measurement: Xbox Series X is just behind PS5.

The RX 6600 XT/6700 would actually be above the 2080; guess clocks matter for this particular benchmark. GPUs are perfectly where they should be in this one, then.
 
The RX 6600 XT/6700 would actually be above the 2080; guess clocks matter for this particular benchmark. GPUs are perfectly where they should be in this one, then.

Interesting how far behind my 1070 is. Obviously this is just measuring back-end performance, so it won't be a perfect reflection of real-world game performance, but it does help illuminate why I struggle to reach the resolutions of the new consoles.

And those new-gen GPUs are just monsters.
 
Was expecting a larger gap between PS5 and Xbox.
Presumably this test isn't just testing fillrate though. How do the ratios compare between other GPUs? Are any in keeping with the exact theoretical differences? If so, the delta between PS5 and XBSX does indeed point to something. Otherwise that console relationship isn't revealed in this benchmark.
 
Was expecting a larger gap between PS5 and Xbox. The theoretical difference in fillrate tests should be greater than 20%. From one perspective, the PS5 could be bandwidth-limited; from another, perhaps the XSX isn't as hamstrung as thought.


The gap is very low; this is a good point for Xbox Series X, and the test is made to benchmark raw fillrate.


Presumably this test isn't just testing fillrate though. How do the ratios compare between other GPUs? Are any in keeping with the exact theoretical differences? If so, the delta between PS5 and XBSX does indeed point to something. Otherwise that console relationship isn't revealed in this benchmark.
This is not a test of the tools but a very specific test of raw fillrate.
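For reference, the "greater than 20%" theoretical figure is easy to reproduce from the public spec summaries. A minimal sketch, assuming 64 pixels per clock of color fill on both consoles (the exact ROP arrangement differs between them) and the commonly quoted clocks:

```python
# Theoretical pixel fillrate = pixels per clock x GPU clock (GHz -> Gpx/s).
# 64 px/clk on both machines is an assumption from public spec summaries;
# the PS5 clock is variable (up to 2.23 GHz), the XSX clock is fixed.
def fillrate_gpx(pixels_per_clock, clock_ghz):
    return pixels_per_clock * clock_ghz

ps5 = fillrate_gpx(64, 2.23)    # ~142.7 Gpx/s
xsx = fillrate_gpx(64, 1.825)   # ~116.8 Gpx/s

print(f"PS5 {ps5:.1f} vs XSX {xsx:.1f} Gpx/s -> {ps5 / xsx - 1:.0%} gap")
# prints a ~22% gap, i.e. "greater than 20%" as stated above
```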
 
Presumably this test isn't just testing fillrate though. How do the ratios compare between other GPUs? Are any in keeping with the exact theoretical differences? If so, the delta between PS5 and XBSX does indeed point to something. Otherwise that console relationship isn't revealed in this benchmark.
Yeah, this one is interesting, to say the least. But the tweet says that this is a very specific test designed to measure the raw fillrate of the shader.
I'm not sure how close this shader should come to theoretical limits, but in some ways this may explain some behaviour.
The primary function of Slug is to take a Unicode string (encoded as UTF-8), lay out the corresponding glyphs, and generate a vertex buffer containing the data needed to draw them. When text is rendered, your application binds the vertex buffer, one of our glyph shaders, and two texture maps associated with the font. One texture holds all of the Bézier curve data, and the other texture holds spatial data structures that Slug uses for efficient rendering.

Typically we look at benchmarks and make a reasonable assumption that game performance and benchmark performance are not the same, and that game performance is usually lower than benchmark performance.
In some ways I was just expecting this relationship to hold, and very much expecting the PS5 to be much further ahead with 2x the number of ROPs (vs. double-pumping) and a 20% higher clock rate.

On game performance, we do see there are times during alpha effects where both the PS5 and XSX struggle, and there are also moments where just the XSX struggles. But it's clear that they can both struggle. I suppose in many ways this benchmark shows how close they are. Perhaps when they both struggle it's a bandwidth limit, and when just the XSX struggles, it's a ROP limit.
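To frame the bandwidth-vs-ROP question, here's a back-of-the-envelope roofline sketch. The specs are the commonly quoted public figures, and the model deliberately ignores DCC, caches, and the fact that the XSX's 560 GB/s only covers its 10 GB fast pool:

```python
# Achieved blend fillrate is capped by whichever is lower: ROP throughput
# or memory bandwidth. RGBA8 blending touches ~8 bytes per pixel
# (read + write of the render target), ignoring Z and compression.
def blend_cap_gpx(pixels_per_clock, clock_ghz, bw_gbs, bytes_per_pixel=8):
    rop_limit = pixels_per_clock * clock_ghz  # Gpx/s from the back end
    bw_limit = bw_gbs / bytes_per_pixel       # Gpx/s from memory
    return rop_limit, bw_limit, min(rop_limit, bw_limit)

for name, (ppc, ghz, bw) in {"PS5": (64, 2.23, 448),
                             "XSX": (64, 1.825, 560)}.items():
    rop, mem, cap = blend_cap_gpx(ppc, ghz, bw)
    print(f"{name}: ROP limit {rop:.0f}, BW limit {mem:.0f} "
          f"-> blending capped at ~{cap:.0f} Gpx/s")
```

Under this crude model both consoles hit the bandwidth wall well before the ROP wall when blending, which lines up with the "both struggle during alpha" observation; opaque fill, which writes far fewer bytes per pixel, would sit much closer to the ROP limit.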
 
This is not a test of the tools but a very specific test of raw fillrate.
Achieved in the shader. How much is influenced by GPU width? Just reading tweet replies...


Suggests the wider XBSX GPU helps reduce the gap somehow.

As I say, how do the other GPUs compare, like 4080 vs 3070, or 1070 vs 1060? Do they all scale in relation to theoretical fillrates or not?

My quick Googlage:

1070 = 107.7 Gpx/s
1060 = 82 Gpx/s

The 1070's is 1.3x bigger, but in the benchmark it's 1.6x bigger.
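Those Googled numbers drop straight out of ROPs x boost clock, so the interesting part is the benchmark exceeding the 1.3x. A sketch (the ROP counts and boost clocks below are the public spec-sheet figures; the 1.6x is the ratio read off the chart):

```python
# Theoretical pixel fillrate for the two Pascal cards.
def theoretical_gpx(rops, boost_mhz):
    return rops * boost_mhz / 1000.0  # Gpx/s

gtx1070 = theoretical_gpx(64, 1683)   # ~107.7 Gpx/s, matches the Googled value
gtx1060 = theoretical_gpx(48, 1708)   # ~82.0 Gpx/s, matches the Googled value

print(f"theoretical ratio: {gtx1070 / gtx1060:.2f}x")  # ~1.31x
print("ratio in this benchmark: ~1.6x")
# The excess over ~1.3x suggests this shader isn't purely back-end bound
# on Pascal; something else (bandwidth, occupancy) scales differently.
```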
 
Interesting how far behind my 1070 is. Obviously this is just measuring back-end performance, so it won't be a perfect reflection of real-world game performance, but it does help illuminate why I struggle to reach the resolutions of the new consoles.

And those new-gen GPUs are just monsters.

Pascal is a 2016 GPU arch... it's practically a GPU from just 2.5 years after the PS4, so yeah, it's old. On top of that, the 1070 was quite a bit below the 1080/1080 Ti; those have some more capabilities.
Shows how fast tech has evolved, still: a 6600 XT would net you ballpark console performance.

That chart just shows how they perform in that rather simplistic test.

A game will give a completely different set of results.

We would assume anyone on this forum is well aware of that ;)
 
Yeah, this one is interesting, to say the least. But the tweet says that this is a very specific test designed to measure the raw fillrate of the shader.
I'm not sure how close this shader should come to theoretical limits, but in some ways this may explain some behaviour.


Typically we look at benchmarks and make a reasonable assumption that game performance and benchmark performance are not the same, and that game performance is usually lower than benchmark performance.
In some ways I was just expecting this relationship to hold, and very much expecting the PS5 to be much further ahead with 2x the number of ROPs (vs. double-pumping) and a 20% higher clock rate.

On game performance, we do see there are times during alpha effects where both the PS5 and XSX struggle, and there are also moments where just the XSX struggles. But it's clear that they can both struggle. I suppose in many ways this benchmark shows how close they are. Perhaps when they both struggle it's a bandwidth limit, and when just the XSX struggles, it's a ROP limit.
I don't think this benchmark is designed to use the depth ROPs. It's probably mainly using the color ROPs, of which both consoles have the same amount.
 