Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

Achieved in the shader. How much is influenced by GPU width? Just reading tweet replies...


Suggests the wider XBSX GPU helps reduce the gap somehow.

As I say, how do the other GPUs compare, like 4080 vs 3070, or 1070 vs 1060? Do they all scale in relation to theoretical fillrates or not?

My quick Googlage,

1070 = 107.7 Gpx/s
1060 = 82 Gpx/s

The 1070's theoretical fillrate is 1.3x higher, but in the benchmark it's 1.6x faster.
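To make the comparison concrete, here is a quick Python sketch of the gap between the quoted theoretical fillrates and the benchmark delta mentioned above. The 1.6x figure is the one from the post, and whether those fillrate numbers are accurate is another matter:

```python
# Theoretical fillrates (Gpixels/s) as quoted above
theoretical = {"GTX 1070": 107.7, "GTX 1060": 82.0}

ratio_theoretical = theoretical["GTX 1070"] / theoretical["GTX 1060"]
ratio_benchmark = 1.6  # observed delta from the glyph benchmark, per the post

print(f"theoretical fillrate ratio: {ratio_theoretical:.2f}x")  # ~1.31x
print(f"benchmark ratio:            {ratio_benchmark:.2f}x")
# The benchmark gap (1.6x) exceeds the fillrate gap (~1.3x), so fillrate
# alone doesn't explain the result; something else (bandwidth, compute,
# raster setup) must be contributing.
```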
Interesting that you chose the 1070 for the comparison as it was always a bit odd fillrate-wise. I'd be curious to see what the 1080 scores, given that both have 64 ROPs and similar clocks.

I'm trying to remember back to exactly which situations the 1070 could flex all 64 ROPs. MSAA resolve was one, which is why I bought it for my first VR card; VR games are very frequently MSAA-heavy and in those, the 1070 was nearly as good as the 1080.


"As for GTX 1070, things are a bit different. The card has all of the ROPs of GTX 1080 and 80% of the memory bandwidth, however what it doesn’t have is GP104’s 4th GPC. Home of the Raster Engine responsible for rasterization, GTX 1070 can only setup 48 pixels/clock to begin with, despite the fact that the ROPs can accept 64 pixels. As a result it takes a significant hit here, delivering 77% of GTX 1080’s pixel throughput. With all of that said, the fact that in-game performance is closer than this is a reminder to the fact that while pixel throughput is an important part of game performance, it’s often not the bottleneck."
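The setup-rate bottleneck described in that quote can be sketched as a min() of rasterizer setup rate and ROP count. The 16 pixels/clock-per-GPC figure and the boost clocks below are my assumptions based on commonly cited Pascal specs, not taken from the quote itself:

```python
# Pixel throughput is limited by whichever is smaller: the rate at which
# the raster engines can set up pixels, or the number of ROPs.
def pixel_throughput_gpx(gpcs, rops, clock_mhz, setup_per_gpc=16):
    setup_rate = gpcs * setup_per_gpc      # pixels/clock the raster engines can set up
    effective = min(setup_rate, rops)      # ROPs can't be fed faster than setup
    return effective * clock_mhz / 1000.0  # Gpixels/s

# Assumed boost clocks: 1733 MHz (1080), 1683 MHz (1070)
gtx1080 = pixel_throughput_gpx(gpcs=4, rops=64, clock_mhz=1733)  # 4 GPCs: 64 px/clk setup
gtx1070 = pixel_throughput_gpx(gpcs=3, rops=64, clock_mhz=1683)  # 3 GPCs: 48 px/clk setup

print(f"GTX 1070 / GTX 1080 = {gtx1070 / gtx1080:.0%}")
# ~73% theoretical with these assumed clocks; close to the 77% the article measured
```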
 
I don't think this benchmark is designed to use the depth ROPs. It's probably mainly using the colour ROPs, of which both consoles have the same amount.
Or…
PS5 is way ahead on fill rate, and XSX is just keeping up on the computation delta.
 
Interesting that you chose the 1070 for the comparison as it was always a bit odd fillrate-wise. I'd be curious to see what the 1080 scores, given that both have 64 ROPs and similar clocks.
Interesting? Like a selective comparison to favour an argument? I banged in two GPUs that have similar part numbers and that I know are somewhat related! 🤣 I don't know these GPUs.

I had invited other more knowledgeable people to make the comparison and answer my question (and the wider question of what factors may be influencing results) for me. As no-one stepped up to the plate, I quickly did a search for one example. I don't even know if those numbers are accurate on that website.

Anyone wanting to correct me and present better data analysis and a fairer comparison to reach an answer is welcome. Indeed, someone who knows these GPUs should! By all means make a fairer comparison using other GPUs. Seeing as I did such a shit job, why don't you spend less time pointing out that the 1070 is an 'interesting' choice, pick a few GPU comparisons, and make us a little table of theoretical fill-rate data and glyph benchmark deltas?

We still have people positing theories comparing XBSX to PS5 without even considering the numerous other references available. Why use science when just imagining things is so much quicker and easier? ¯\_(ツ)_/¯

* rant caveat - this assumes your choice of 'interesting' was accurate to intention. It's possible you meant something like "unfortunate" or "well, I wouldn't have picked the 1070" or similar.
 
I think he chose "interesting" as "unfortunate". I don't think he was being malicious in his comment there. I wouldn't have known that difference off the top of my head, so I gave him a like for pointing it out.
But in another B3D timeline where we had more than just a like button, with others like insightful or helpful etc., I would have chosen a different response.
 
I think he chose "interesting" as "unfortunate". I don't think he was being malicious in his comment there. I wouldn't have known that difference off the top of my head, so I gave him a like for pointing it out.
But in another B3D timeline where we had more than just a like button, with others like insightful or helpful etc., I would have chosen a different response.

It's easy to suspect something whenever someone makes a comparison between platforms, just because you know what platform preferences a user has.

I noticed the 1070 comparison too; however, I took a careful approach, just saying 'there was better stuff like the 1080 to compare with' instead of 'that was an interesting choice as a comparison', which could result in a potential warning, accusation, etc.
Now I'm sure that T2098 had zero bad intentions; it's just worth being aware of these things to avoid confusion.
In PC-vs-PC discussions it's less likely to end up like this, but when different platforms/IHVs are involved, be careful.
 
Interesting that you chose the 1070 for the comparison as it was always a bit odd fillrate-wise. I'd be curious to see what the 1080 scores, given that both have 64 ROPs and similar clocks.

I'm trying to remember back to exactly which situations the 1070 could flex all 64 ROPs. MSAA resolve was one, which is why I bought it for my first VR card; VR games are very frequently MSAA-heavy and in those, the 1070 was nearly as good as the 1080.


"As for GTX 1070, things are a bit different. The card has all of the ROPs of GTX 1080 and 80% of the memory bandwidth, however what it doesn’t have is GP104’s 4th GPC. Home of the Raster Engine responsible for rasterization, GTX 1070 can only setup 48 pixels/clock to begin with, despite the fact that the ROPs can accept 64 pixels. As a result it takes a significant hit here, delivering 77% of GTX 1080’s pixel throughput. With all of that said, the fact that in-game performance is closer than this is a reminder to the fact that while pixel throughput is an important part of game performance, it’s often not the bottleneck."

He does the benchmark with the GPUs he possesses... Some could criticize the fact that he used a 4080 and not a 4090, for example, but he can't have every GPU, even if he has many GPUs because of his job.
 
I came across this article and got a bit annoyed...


I don't think that two games with varying circumstances last year are somehow irrefutable proof that the consoles are weak two years in.

On the contrary, I think this year we will really start to see what they can do as devs drop the previous gen more and more.

Also, games still have to run on Series S, which will keep PS5 and Series X ambition targets grounded. In theory, that means they should never feel long in the tooth, at least not until they are close to being retired in favor of the next generation.
 
I don't think that two games with varying circumstances last year are somehow irrefutable proof that the consoles are weak two years in.

They are weak; they're comparable to 4-year-old mid-range PC GPUs, and we are already seeing that in games relative to what you get on PC.

A £400 RTX 3060 Ti alone is ~30% faster in raster and, depending on the game, up to twice as fast in ray tracing.
 
I feel an issue with this console generation is expectations. Going from 30fps to 60fps and from 1080p to 4K alone already requires a significant increase in hardware resources. There is also, in general, a diminishing rate of return in perceived visual fidelity and scope relative to the hardware resources available, not to mention the increase in development complexity. Flexibility in fps and resolution, off the pseudo-4K60 target, would alleviate a lot of the pressure on the resources needed to deliver increased fidelity and scope.
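The back-of-envelope arithmetic behind that expectation gap is easy to check; going from 1080p30 to 4K60 alone multiplies the pixel throughput demand by 8x:

```python
# Pixel work per second: resolution factor times framerate factor
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160

resolution_factor = pixels_4k / pixels_1080p   # 4.0x the pixels
framerate_factor = 60 / 30                     # 2.0x the frames
total = resolution_factor * framerate_factor

print(f"{total:.0f}x the pixel throughput")  # 8x
# An 8x demand swallows much of the generational GPU uplift before any
# per-pixel fidelity improvements even enter the picture.
```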
 
They are weak; they're comparable to 4-year-old mid-range PC GPUs, and we are already seeing that in games relative to what you get on PC.

A £400 RTX 3060 Ti alone is ~30% faster in raster and, depending on the game, up to twice as fast in ray tracing.
I didn't even talk about comparisons with PC to begin with, so I don't know why you're bringing PC up.

They are a huge jump from last gen and on their own are fine machines. All I'm saying is, we haven't even begun to see what they can do with the cross-gen period so prominent, so basing their performance profile on 2 games, one of which features significant issues even on PC, is silly.

Unless you're claiming games made to run on machines with only a small fraction of the power are representative of what Xbox Series and PS5 are capable of. I guess PS3 games running on PS4 at a higher resolution would also count in your book.
 
I feel an issue with this console generation is expectations. Going from 30fps to 60fps and from 1080p to 4K alone already requires a significant increase in hardware resources. There is also, in general, a diminishing rate of return in perceived visual fidelity and scope relative to the hardware resources available, not to mention the increase in development complexity. Flexibility in fps and resolution, off the pseudo-4K60 target, would alleviate a lot of the pressure on the resources needed to deliver increased fidelity and scope.
Devs on console already don't care about 4K with the advent of dynamic res. The hardware will run whatever resolution the game can tolerate, even more so with FSR 2.1/TSR and other capable smart upscalers.

60fps may not be doable for every game, but I think most games will be able to aim for that and still be generally impressive.
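A dynamic resolution controller of the kind mentioned above is conceptually simple; this is a minimal sketch with made-up gain and clamp values, not any real engine's heuristic:

```python
# Nudge the render scale each frame so GPU frame time converges on a
# budget (16.6 ms for 60fps). Gain and clamp range are illustrative.
def update_render_scale(scale, frame_ms, budget_ms=16.6, gain=0.5,
                        lo=0.5, hi=1.0):
    error = (budget_ms - frame_ms) / budget_ms   # >0 means headroom
    scale += gain * error * scale                # move toward the budget
    return max(lo, min(hi, scale))               # clamp to a sane range

scale = 1.0
for frame_ms in (20.0, 19.0, 17.5, 16.0, 15.5):  # a GPU-heavy stretch
    scale = update_render_scale(scale, frame_ms)
    print(f"frame {frame_ms} ms -> render scale {scale:.2f}")
```

The output resolution stays fixed; only the internal render target shrinks and is upscaled, which is why players mostly perceive the frame rate and not the resolution drop.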
 
I didn't even talk about comparisons with PC to begin with, so I don't know why you're bringing PC up.
1. Because I can
2. PC is a competitor of console
They are a huge jump from last gen and on their own are fine machines.
Last gen was also using old mid-range PC GPUs too.
All I'm saying is, we haven't even begun to see what they can do with the cross-gen period so prominent, so basing their performance profile on 2 games, one of which features significant issues even on PC, is silly.
No it's not.

Performance does not equal graphics.
Unless you're claiming games made to run on machines with only a small fraction of the power are representative of what Xbox Series and PS5 are capable of.
The performance they display in those games is what they're capable of, because if they could do more they would offer more.

30/40fps at native 4k in Forbidden West is all PS5 can do, the next Horizon game likely won't even have a native 4k mode and will heavily require upscaling tech at 30fps to offer a big visual jump.
I guess PS3 games running on PS4 at a higher resolution would also count in your book.

Yes they do.

We are also slowly getting to the point where a more powerful PC isn't that much more expensive than a disc edition PS5, so their value for money as gaming devices is also dropping.
 
1. Because I can
2. PC is a competitor of console

Last gen was also using old mid-range PC GPUs too.

No it's not.

Performance does not equal graphics.

The performance they display in those games is what they're capable of, because if they could do more they would offer more.

30fps at native 4k in Forbidden West is all PS5 can do, the next Horizon game likely won't even have a native 4k mode and will heavily require upscaling tech at 30fps to offer a big visual jump.


Yes they do.
Didn't Forbidden West also offer a 40fps mode?
 
DF and just about every reviewer gauges console performance using PC hardware/GPUs. Are they powerful? Yes, compared to last generation they sure are; however, the jump isn't that big (DF's take), so expecting 4K, 60fps and that large a jump in fidelity won't happen. RT would impact that even more.

It's basically an RX 6600 XT and a Zen 2 CPU tied to a shared 16GB of RAM. Optimizations only get you so far.
 
1. Because I can
2. PC is a competitor of console

I'm not sure why you're bringing up PC even now, when I am speaking in a vacuum about what the consoles can do. And I don't know what market competition has to do with what I'm talking about, when I am referring to what developers can bring out of the platform.

Last gen was also using old mid-range PC GPUs too.
Your premise is wrong, as the CPUs in the current consoles are much greater than the Jaguar CPUs, and devs will likewise be able to take advantage of them to make better games. What relation they have to PC components, as I said before, is irrelevant to the post you're responding to.
No it's not.

Performance does not equal graphics.
I wasn't just referring to graphics either, but to what developers will be able to do at 60fps when fully dedicating their time to the new machines. Games at 60fps with scale, scope and detail all at the same time were rare last gen, and now they don't have to be, even if it still requires compromises.

The performance they display in those games is what they're capable of, because if they could do more they would offer more.
Even Alex has said Gotham Knights is not representative of any hardware, and that 60fps may be possible in A Plague Tale with reordered settings and lower resolution targets.
30fps at native 4k in Forbidden West is all PS5 can do, the next Horizon game likely won't even have a native 4k mode and will heavily require upscaling tech at 30fps to offer a big visual jump.
And I'm saying that what PS5 and Series X can do with dev teams dedicated to squeezing everything out of the hardware isn't mutually exclusive with workflow optimizations like dynamic res, various fps options and various cutbacks, depending on what the dev wants to do. I'm not sure why you think native 4K is something I consider important, or even a part of what I'm saying.
Yes they do.

We are also slowly getting to the point where a more powerful PC isn't that much more expensive than a disc edition PS5, so their value for money as gaming devices is also dropping.
Your obsession with PC vs console is strange, and you're needlessly aggressive over what should be a pretty obvious statement of fact: devs will be able to get a lot out of the current hardware, like they always do.

I don't think I want to speak with you anymore.
 
DF and just about every reviewer gauges console performance using PC hardware/GPUs. Are they powerful? Yes, compared to last generation they sure are; however, the jump isn't that big (DF's take), so expecting 4K, 60fps and that large a jump in fidelity won't happen. RT would impact that even more.

It's basically an RX 6600 XT and a Zen 2 CPU tied to a shared 16GB of RAM. Optimizations only get you so far.
4K isn't a part of my calculation. (Resolution is dead.)

And I'm not even necessarily talking about 60fps either, although I think, as usual, the fps target will come down to the devs' priorities and goals for their individual project.

All I am saying is that the PS5 and Xbox consoles still have a lot to offer, and as usual with console gens, devs will push them for years to come and get good things out of them.

How is that not a reasonable take?
 