Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

By that argument, 12TF, 560GB/s, and 16GB of memory can only do as well as that 1070, too.
None of it is sensible when you compare things to PC, especially if you aren't doing serious investigation and benchmarking. You're just shooting in the dark ultimately.

I don't know what else to say. There are only so many factors that performance can be attributed to:
ALU, frequency, bandwidth, and storage.

Outside of that, it's just poor programming or poor performance from the drivers.
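For context, the ALU and frequency factors map directly onto the theoretical compute figures everyone quotes. A minimal sketch in Python (the CU counts and clocks are the publicly quoted specs, taken as assumptions):

```python
# Back-of-envelope peak-FP32 calculator: TFLOPS = shader lanes x 2 ops (FMA) x clock.
# CU counts and clocks below are the publicly quoted figures, not measured values.

def peak_tflops(compute_units: int, clock_ghz: float, lanes_per_cu: int = 64) -> float:
    """Theoretical FP32 throughput in TFLOPS (2 FLOPs per lane per cycle via FMA)."""
    return compute_units * lanes_per_cu * 2 * clock_ghz / 1000.0

print(f"XSX: {peak_tflops(52, 1.825):.2f} TF")  # ~12.15 TF
print(f"PS5: {peak_tflops(36, 2.23):.2f} TF")   # ~10.28 TF (at max boost clock)
```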
nah, it's just not fast enough to push 2.25x more pixels than 1440p when targeting 4k
You're just shooting in the dark ultimately.
wrote the person who theorized that Tony Hawk is a bandwidth killer at 1440p ;)
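(For reference, the 2.25x figure is just the pixel-count ratio between native 4K and 1440p:)

```python
# 2.25x is the pixel-count ratio of native 4K (3840x2160) to 1440p (2560x1440).
print((3840 * 2160) / (2560 * 1440))  # 2.25
```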
 
nah, it's just not fast enough to push 2.25x more pixels than 1440p when targeting 4k

wrote the person who theorized that Tony Hawk is a bandwidth killer at 1440p ;)
120fps on the consoles is the bandwidth killer, not Tony Hawk. Please go back and read.

You have zero insight into how much the CPU is picking up the rendering load there. One of those processors is easily over 1000 CAD; the Xeon competes in the price bracket of a 3950X, and even the 3600X still vastly outperforms what's in the consoles. Without insight into how much the CPU is assisting with things like culling, you don't know how much bandwidth it can take up. Sorry if this is an insufficient answer, but ALU, clock speed, and storage are all covered by the consoles. From a hardware perspective, that only leaves bandwidth. Outside of that you have drivers, which once again can be heavily offset by CPU performance on PC. That leaves only poor coding in the game.

Your entire argument is: a PC with a CPU that is more expensive than my entire console and a weaker GPU should not be able to outperform my console with a weaker CPU, shared memory, and a stronger GPU.
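To put rough numbers on the 120fps point, here's a back-of-envelope sketch; the bandwidth figures are the publicly quoted specs, and the point is only that doubling the frame rate halves the per-frame byte budget:

```python
# Per-frame bandwidth budget: total bus bandwidth divided by target frame rate.
# On both consoles this budget is shared with the CPU (unified memory).

def gb_per_frame(bandwidth_gb_s: float, fps: int) -> float:
    return bandwidth_gb_s / fps

for name, bw in [("PS5 (448 GB/s)", 448.0), ("XSX (560 GB/s)", 560.0)]:
    print(f"{name}: {gb_per_frame(bw, 60):.2f} GB/frame @60fps, "
          f"{gb_per_frame(bw, 120):.2f} GB/frame @120fps")
```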
 
A true next-gen experience is coming to the PlayStation 5 and Xbox Series X|S featuring improved resolution, spatial audio, high-fidelity atmospherics, and more.

Skate in super crisp 120 FPS at 1080P, or 60 FPS in native 4K*. Watch the levels come to life like never before with sharper dynamic shadows, reflections, and lens flare, plus enhanced skater textures and more.

What are "high-fidelity atmospherics"? All of this sounds like marketing buzzwords for simple enhancements already found in the PC version.
 
120fps on the consoles is the bandwidth killer, not Tony Hawk. Please go back and read.

You have zero insight into how much the CPU is picking up the rendering load there. One of those processors is easily over 1000 CAD; the Xeon competes in the price bracket of a 3950X, and even the 3600X still vastly outperforms what's in the consoles. Without insight into how much the CPU is assisting with things like culling, you don't know how much bandwidth it can take up. Sorry if this is an insufficient answer, but ALU, clock speed, and storage are all covered by the consoles. From a hardware perspective, that only leaves bandwidth. Outside of that you have drivers, which once again can be heavily offset by CPU performance on PC. That leaves only poor coding in the game.

Your entire argument is: a PC with a CPU that is more expensive than my entire console and a weaker GPU should not be able to outperform my console with a weaker CPU, shared memory, and a stronger GPU.
ok, but now you are talking about the CPU (in another benchmark a 3600 + a weak 1660 Super was almost enough for 120fps, btw), and I just wanted to point out that your whole bandwidth theory is not accurate in Tony Hawk's case, as this game is not bandwidth demanding at all, even at 1440p@120
 
Around 5700 XT performance when not bandwidth bottlenecked isn't all that powerful of a GPU these days anyway. Concessions will have to be made here and there, like Control on PS5 being pretty much equal to low/medium settings on PC.
 
ok, but now you are talking about the CPU (in another benchmark a 3600 + a weak 1660 Super was almost enough for 120fps, btw), and I just wanted to point out that your whole bandwidth theory is not accurate in Tony Hawk's case, as this game is not bandwidth demanding at all, even at 1440p@120
To prove your point you need to baseline a lot more things than you are currently:
a) The consoles may not be running the same code base as PC.
b) Even if they are running a completely inefficient code base compared to PC, they will still hit hardware bottlenecks.
c) Is there even a DX12 version? What if they are running DX11 on Xbox and GNM on PS5? How do we know the impact of that on the consoles as frame rates scale up?

I'm not saying programming, APIs, and drivers can't be the issue. They most certainly ARE the issue, and a good rewrite of the entire game is likely needed to improve things.
But poor coding will hit one of those limits as well, right? Just because I coded poorly doesn't necessarily mean I didn't hit hardware bottlenecks.

So the only thing that makes sense is for it to hit a bandwidth limit. You can choose how many cores and threads you use. You can choose your graphics settings. But the programmers can't choose the frequency, and they can't choose how much ALU the work is divided over. It's clearly not a game that will break 16GB of storage.
There's not much else to explain why it isn't obtaining a higher resolution, except a bandwidth bottleneck or the GPU just sitting around idle way more than it should be.

If your argument is that you think they could do way better than this, the answer is yes. Undeniably. It's incredibly poorly optimized.
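For a sense of scale, a crude render-target traffic estimate; the bytes-per-pixel and "touches" figures below are illustrative guesses, and real traffic (texture sampling, geometry, the CPU's share) comes on top of this:

```python
# Very crude render-target traffic estimate:
# GB/s ~= pixels x bytes_per_pixel x touches x fps.
# All tunables here are illustrative guesses, not measurements.

def rt_traffic_gb_s(width, height, fps, bytes_per_pixel=16, touches=4.0):
    """'touches' folds overdraw, blending, and post-process
    reads/writes into a single fudge factor."""
    return width * height * bytes_per_pixel * touches * fps / 1e9

print(f"4K@60:     {rt_traffic_gb_s(3840, 2160, 60):.0f} GB/s")
print(f"1440p@120: {rt_traffic_gb_s(2560, 1440, 120):.0f} GB/s")
print(f"1080p@120: {rt_traffic_gb_s(1920, 1080, 120):.0f} GB/s")
```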
 
Would go for the xsx (and pc) too.
The discussion point for me was about why the XSX and PS5 couldn't go higher moving from 60 to 120fps.

Looking at PC (Nvidia) benchmarks, there is nearly enough for both consoles to hit 4K@120 - possibly. Unfortunately, the ones I've seen from AMD don't fare as well.
But I'll just leave it at that.

I just wanted to explain why there was such a drop-off from 4K60 to 1080p@120, and in general we see this particular behaviour happen on XSX as well. In this case, the only reason I think XSX holds up a bit better is that it has ~112GB/s more bandwidth to lean on.

If you assume the codebase is the same between the console modes (60 and 120), and they are just reducing resolution to improve frame rate, one of those things will need to bottleneck (compute or fillrate, both of which need to be fed by bandwidth - and both suffer from sharing it with the CPU)...
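Worth noting the raw pixel throughput each mode implies (simple arithmetic, nothing more):

```python
# Pixels per second pushed by each mode. 1080p@120 is half the pixel rate
# of 4K@60, and even 1440p@120 stays below it - so raw pixel throughput
# alone doesn't explain settling at 1080p; something else is binding.
for name, w, h, fps in [("4K@60", 3840, 2160, 60),
                        ("1440p@120", 2560, 1440, 120),
                        ("1080p@120", 1920, 1080, 120)]:
    print(f"{name}: {w * h * fps / 1e6:.0f} Mpix/s")
```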

The other alternative is that they are using DX11 and GNM, and the API is just so slow that both GPUs are sitting around dicking about. I thought about that too, but the PC benchmarks that are breaking 120+ are running DX11 as well.

So that doesn't leave me with much honestly. The code is not optimal in taking advantage of these consoles, but in selecting where that code is likely struggling, I'd have to say bandwidth.

It's also possible they are running higher graphical settings than found on PC; I don't know. We will need to wait for reviews to showcase everything.

The thing with PC frame rate reporting is that there's not a lot out there.
This guy from PC Gamer:
Graphically, nothing here is phenomenal but it's a strong remaster of the old levels' basic geometry. Vehicles and skaters have far more detail, there's no obvious pop-in, and the lighting and texture work have many standout moments. My Nvidia RTX 2070 ran the game at 1080p on automatic max settings just fine for the most part, though did stutter a little on the longer, downhill-style levels, so some tweaking will likely be required if you want it to be perfectly smooth.

I honestly don't know how accurate some of those FPS reporting tools on YouTube are. We'd need a proper investigation here. I'm just calling out what I think the biggest barrier is for the consoles to increase frame rate to 120fps: bandwidth.
 
Power usage of the PS5 while in the 1080p@120hz mode should settle this.

I picked up the game and it's bringing back memories from 20+ years ago at uni. Nothing about it strikes me as pushing the CPU hard; beyond anything happening in the background, the game would be running largely the same processes as it did on the PS1. *ahem*

It does strike me as inconsistent and a bit odd that there's such a massive resolution differential in this game. I can't see how bandwidth or the power shifting of the PS5 could be causing such a difference.

There may be a good reason for it, but I'm unconvinced by the current suggested reasons. I've got several 120hz games for the PS5 (COD, Fortnite, WRC9, Destiny 2, and Pro Skater 1 + 2). This game is the simplest of the lot.
 
Power usage of the PS5 while in the 1080p@120hz mode should settle this.

I picked up the game and it's bringing back memories from 20+ years ago at uni. Nothing about it strikes me as pushing the CPU hard; beyond anything happening in the background, the game would be running largely the same processes as it did on the PS1. *ahem*

It does strike me as inconsistent and a bit odd that there's such a massive resolution differential in this game. I can't see how bandwidth or the power shifting of the PS5 could be causing such a difference.

There may be a good reason for it, but I'm unconvinced by the current suggested reasons. I've got several 120hz games for the PS5 (COD, Fortnite, WRC9, Destiny 2, and Pro Skater 1 + 2). This game is the simplest of the lot.
Of course. PS5 actually has the advantage in COD and Destiny 2 (and DMC5) at 120hz, and those games are much more demanding than this one and use modern engines with all the bells and whistles. Duh. Sometimes the simplest explanation is the right one: PS5 can't output 1440p at 120hz, hence it doesn't have this mode. This would not be the first time a (notably cross-gen) game doesn't run at 120hz on PS5 because of, for instance, the lack of PS4 BC 120hz support, and the devs just don't care about the outcome.
 
There may be a good reason for it, but I'm unconvinced by the current suggested reasons. I've got several 120hz games for the PS5 (COD, Fortnite, WRC9, Destiny 2, and Pro Skater 1 + 2). This game is the simplest of the lot.

The only like-for-like comparison there is Fortnite (on the same engine), and FN is clearly light on very expensive stuff like volumetrics for exactly this reason. Racing games have totally different requirements, Destiny 2 is an old game, and COD is a super-optimized, cutting-edge clustered forward renderer with very different tradeoffs to UE4.
 
Of course. PS5 actually has the advantage in COD and Destiny 2 (and DMC5) at 120hz, and those games are much more demanding than this one and use modern engines with all the bells and whistles. Duh. Sometimes the simplest explanation is the right one: PS5 can't output 1440p at 120hz, hence it doesn't have this mode. This would not be the first time a (notably cross-gen) game doesn't run at 120hz on PS5 because of, for instance, the lack of PS4 BC 120hz support, and the devs just don't care about the outcome.
You're hilarious.
 
Yeah, I'm not buying that it's not rendering at 1440p because the PS5 doesn't output at 1440p.

It's surely not that either; the development studio certainly knows the difference between render and output resolutions.
 
The studio might, but apparently not the posters here.

Half tempted to pick up a power monitor and test the game - it'd be interesting to see the power requirements of the 4k@60hz mode and compare.

If DigitalFoundry did it for this game's faceoff then it'd save me a bit of money.
 
Half tempted to pick up a power monitor and test the game - it'd be interesting to see the power requirements of the 4k@60hz mode and compare.

If DigitalFoundry did it for this game's faceoff then it'd save me a bit of money.

I (and a few others, I assume) would like to see power draw numbers plotted over time alongside the captured gameplay video. The trouble is there really isn't a good, reasonably priced power meter with logging capabilities.
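If the samples can be dumped to a CSV, lining them up with a capture is straightforward. A minimal sketch (the power_log.csv file name and its two-column layout are made up for illustration):

```python
# Plot logged (seconds_since_start, watts) samples so they can be eyeballed
# against a gameplay capture started at the same moment.
import csv
import matplotlib.pyplot as plt

t, watts = [], []
with open("power_log.csv", newline="") as f:  # hypothetical log file
    for row in csv.reader(f):
        t.append(float(row[0]))
        watts.append(float(row[1]))

plt.plot(t, watts)
plt.xlabel("Seconds since capture start")
plt.ylabel("Power draw (W)")
plt.title("PS5 power draw, 1080p@120 mode")
plt.show()
```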
 
Somehow this didn't show up when I was looking back in November.

Might have found the least evil hack for dealing with this: the "Watts Up? .Net" power meter, which used to be rather expensive but turns up cheap on eBay and other auction sites for $35 - $50. It has a USB connection and can store 12,000 sample points. If it has a clear command, you can sync it at the launch of a game, then download the data at the end of the run.

Here's a blog about it and an Amazon review of a different model:
https://myrandomtechblog.com/cryptomining/watts-up/
https://www.amazon.com/gp/customer-...ef=cm_cr_dp_d_rvw_ttl?ie=UTF8&ASIN=B000CSWW92
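In case it helps, a rough sketch of what the clear-then-download flow could look like over the meter's USB serial port. The port name, baud rate, and command strings are all placeholders, not the documented Watts Up protocol - check the meter's manual before relying on any of this:

```python
# Hypothetical clear-and-download flow for a USB serial power meter.
# PORT, BAUD, and the command strings are placeholders, NOT the real
# "Watts Up? .Net" protocol.
import serial  # pyserial

PORT, BAUD = "/dev/ttyUSB0", 115200

with serial.Serial(PORT, BAUD, timeout=2) as meter:
    meter.write(b"#CLEAR;\n")   # placeholder: reset the internal log
    input("Log cleared - start the game, press Enter when the run is done.")
    meter.write(b"#DUMP;\n")    # placeholder: request the stored samples
    with open("power_log.csv", "w") as out:
        while True:
            line = meter.readline().decode(errors="replace").strip()
            if not line:         # read timeout: assume the dump is finished
                break
            out.write(line + "\n")
```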
 
The question is: what does that have to do with not rendering at 1440p120?
I was specifically answering @Karamazov's post. Sorry if it missed some additional context that I apparently didn't follow.


Are people playing Avengers? It seemed to me the game was going the way of Anthem after it bombed hard in both user and review scores half a year ago.
At some point Crystal Dynamics will need to reallocate their resources to more profitable ventures, and maybe that point has already passed.
 