Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

When the framerate drops on both systems, it'd be interesting to capture the resolutions over the course of that single second, to see the total pixels rendered.

But really it might be overkill, especially when both consoles are performing remarkably similarly.
Pixel counting is incredibly difficult, as you have to find aliased lines of a specific length to perform it. I'm not sure what the chances are of finding aliased frames at those specific moments. Reconstruction à la temporal upsampling makes this much harder, unfortunately. If it were a simple upsample, I think resolution counting would be straightforward and automated, but the move towards temporal makes this incredibly challenging, and thus we have no automated counters today.
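For anyone wondering what "finding aliased lines" involves, here's a toy sketch of one common variant of the manual technique. It assumes you've already found a clean aliased edge whose stair-steps would be ~1 px long at native resolution (roughly a 45° line) and measured the step lengths in output pixels; the function and its inputs are purely illustrative:

```python
def estimate_render_width(step_lengths_px, output_width):
    """Estimate native render width from an aliased edge.

    Assumes an edge whose stair-steps would be ~1 px long at native
    resolution; after a simple spatial upscale each step stretches by
    the scale factor, so an average step of s output pixels implies
    the image was rendered at roughly output_width / s.
    """
    avg_step = sum(step_lengths_px) / len(step_lengths_px)
    return round(output_width / avg_step)

# steps alternating 1 and 2 px (average 1.5) on a 1920-wide output
print(estimate_render_width([1, 2, 1, 2, 1, 2], 1920))  # -> 1280
```

Temporal reconstruction breaks the assumption that step length maps cleanly to a scale factor, which is exactly why this resists automation.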
 
Well, it was a realistic probable cause. And anyway, it still can't output 1440p120.
It's one of the silliest excuses I've ever read. There are already games rendered at 1440p on PS4 Pro and PS5. Not to mention that dynamic resolution games, with all the weird rendering resolutions involved, wouldn't otherwise be possible.
 
yea. mmm.

You could try just hammering the share button and taking photo stills at points you think the resolution will drop. Outside of that I'm not sure. You may want to ask @VGA himself.
You could try that, yes. But on the other hand, you never know how "cheaty" those screenshot implementations are. For a fair comparison, it's always best not to inform the system that you want to take a screenshot. It wouldn't be the first time "bullshots" were presented instead of real screenshots (that has nothing to do with Sony or MS, just that some game developers implement enhanced screenshots, e.g. like a photo mode).
 
So this is interesting:


Apparently the PS5 does render at 1440p in 120hz mode. According to this, the consoles have variable resolutions in both modes, and the PS5 has a higher low. Weirdly inconsistent with the resolutions from the press release.

Xbox Series X Fidelity Mode: 3264x1836 to 3840x2160
Xbox Series X 120hz Mode: 1920x1080 to 2560x1440

PS5 Fidelity Mode: 3552x1998 to 3840x2160
PS5 120hz Mode: 2112x1188 to 2560x1440
To clarify, these are only the lowest resolutions they have found; they can only test so many frames.
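To put those ranges in raw pixel terms, here's a quick back-of-the-envelope script (the numbers are just the ones listed above):

```python
# min/max DRS resolutions as reported above
modes = {
    "XSX Fidelity": ((3264, 1836), (3840, 2160)),
    "XSX 120hz":    ((1920, 1080), (2560, 1440)),
    "PS5 Fidelity": ((3552, 1998), (3840, 2160)),
    "PS5 120hz":    ((2112, 1188), (2560, 1440)),
}

for name, ((lw, lh), (hw, hh)) in modes.items():
    print(f"{name}: {lw * lh / 1e6:.2f}-{hw * hh / 1e6:.2f} MP")
```

That puts the PS5's observed floor at roughly 18% more pixels in Fidelity mode and roughly 21% more in 120hz mode, though again these are just the worst cases caught in a limited sample.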
 
When the framerate drops on both systems, it'd be interesting to capture the resolutions over the course of that single second, to see the total pixels rendered.

But really it might be overkill, especially when both consoles are performing remarkably similarly.

Yes, as I think I mentioned before, most of the gen9 SeriesX vs PS5 comparisons seem boring, with NXGamer, DF, ElAnalysta, etc. spending hours trying to find two pixels with the same color value just to distinguish between e.g. 2160p and 2040p+upscaling in screenshots made up of 8 million pixels.

I guess it's the result of the Gen9s still mostly running Gen8 code and brute-forcing their way through it. Give it a few years down the road and we'll probably see a bit more differentiation (e.g. slightly higher geometry on PS5, slightly richer shader effects on SeriesX) that will make the work less boring for the people doing the comparisons.
 
Pixel counting is incredibly difficult, as you have to find aliased lines of a specific length to perform it. I'm not sure what the chances are of finding aliased frames at those specific moments. Reconstruction à la temporal upsampling makes this much harder, unfortunately. If it were a simple upsample, I think resolution counting would be straightforward and automated, but the move towards temporal makes this incredibly challenging, and thus we have no automated counters today.
Honestly, I love DF and many of their competitors, but all of the pixel counting today is based on limited samples and worst cases. Watch any DF video and they will probably mention that they have to wait for camera cuts or other special circumstances to find raw pixel edges. These situations are usually where temporal AA solutions have failed, so these pixel counts aren't representative of the quality of the image in any other case.

This isn't to say that the numbers aren't useful information. It's a useful comparison point for games with a similar code base and feature set, but there are still situations where a raw pixel count may be lower or higher than perception would lead you to believe. Some games use lower-resolution alpha effects, or volumetrics, or reflections for performance. I can't remember the game I played last year, but it used quarter-res alpha on tree branches, and you would be wandering through these wooded areas and the resolution of the trees against the sky looked terrible, because it was essentially 540p even though the game was outputting 1080p.

Also, we had this conversation about resolution in 2015. I was having a bit of déjà vu and reread that thread. It was muddy back then; it's worse now.
 
So this is interesting:


Apparently the PS5 does render at 1440p in 120hz mode. According to this, the consoles have variable resolutions in both modes, and the PS5 has a higher low. Weirdly inconsistent with the resolutions from the press release.

Xbox Series X Fidelity Mode: 3264x1836 to 3840x2160
Xbox Series X 120hz Mode: 1920x1080 to 2560x1440

PS5 Fidelity Mode: 3552x1998 to 3840x2160
PS5 120hz Mode: 2112x1188 to 2560x1440

In hindsight, the official tweet made no sense, as most players would be playing at 4K/120 if they wanted the performance mode.

Interestingly, this is the first fps win for the X, even if the difference is basically nothing.

Overall, essentially parity between the 5 and the X.
 
I bought this PS5 update (10 dollars) for Tony Hawk and have to say it's the first PS5 game (the others I checked were WRC 9 and Dirt 5) where the 120 fps graphics are very comparable to 60 fps, but you see the fluidity of 120 fps in camera movement, so it's the mode to play. Good update (though the use of the DualSense adaptive triggers for reverts is debatable).
 
In hindsight, the official tweet made no sense, as most players would be playing at 4K/120 if they wanted the performance mode.

Interestingly, this is the first fps win for the X, even if the difference is basically nothing.

Overall, essentially parity between the 5 and the X.
Yes, there seems to be DRS with a bit more aggressive settings on XSX this time (contrary to some previous comparisons). Min res is actually higher on PS5, but it's hard to judge an actual winner by those differences. I think the game performs virtually the same on both machines here.
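For context on what "more aggressive settings" could mean, here's a minimal sketch of a typical frame-time-driven DRS controller; the thresholds and step sizes are made up, since nobody outside the developers knows the actual heuristic:

```python
def update_resolution_scale(scale, gpu_ms, target_ms,
                            min_scale=0.5, max_scale=1.0, step=0.05):
    """Nudge the render-resolution scale toward the GPU frame-time target.

    A 'more aggressive' tuning would be a lower min_scale or a larger
    step, i.e. trading resolution away sooner to protect frame rate.
    """
    if gpu_ms > target_ms:            # over budget: drop resolution
        scale -= step
    elif gpu_ms < 0.9 * target_ms:    # comfortably under: raise it
        scale += step
    return max(min_scale, min(max_scale, scale))

# 120 fps leaves ~8.3 ms of GPU time per frame
print(update_resolution_scale(1.0, gpu_ms=9.1, target_ms=8.3))  # -> 0.95
```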
 
Just catching up here now. So, a tweet mentioning that the PS5 version of Tony Hawk's performance mode was supposedly only 1080p/120hz caused all the fanboy ruckus on other forums? Was this tweet even verified?
 
It was more than that; it was done in the artwork style from the game and posted by the official game Twitter account. It got the full workup.



Oh, I see. So, either Activision/Vicarious accidentally miscommunicated something, or Sony's check finally cleared. Seriously, this had to be a mistake to begin with. 4K/60fps would be far more GPU intensive than 1440p/120fps. Oh well...
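For what it's worth, the raw pixel math is closer than it might seem on a per-second basis, though the per-frame budget tells a different story:

```python
fid_px  = 3840 * 2160   # 8.29 MP per frame in Fidelity mode
perf_px = 2560 * 1440   # 3.69 MP per frame in 120hz mode

print(fid_px / perf_px)               # 2.25x the pixels per frame
print(fid_px * 60 / (perf_px * 120))  # but only ~1.13x per second
```

So 4K/60 shades about 13% more pixels per second overall, while at 120 fps every fixed per-frame cost (geometry, post-processing setup, etc.) has to fit into half the time.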
 
Oh, I see. So, either Activision/Vicarious accidentally miscommunicated something, or Sony's check finally cleared. Seriously, this had to be a mistake to begin with. 4K/60fps would be far more GPU intensive than 1440p/120fps. Oh well...
still seems on the low side of things if I'm being honest.
 
What is? Were you expecting 1600p-1800p @120hz?
If you look at PC benchmarks for a 5700 XT, it's running north of 100 fps at 4K.
The consoles seem to cluster around the 1440p mark for 120 fps.
Its range is between 80 and 120 fps at 4K.
The 80 is for heavy effects, though.

I don't mind that the consoles are running below a 5700 XT. It's just something I've noticed: at very high frame rates, both consoles can't seem to keep up with their PC equivalent counterparts. The discussion around bandwidth was about that, because from a TF perspective both console GPUs can clearly outperform the 5700 XT. If the ALUs aren't being fed, then... they're just sitting idle, which I suspect is what's happening here on console.
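On the TF point, the paper numbers do favor the consoles; the standard peak-FP32 formula is shader cores × 2 ops per clock (FMA) × clock:

```python
def tflops(shader_cores, clock_ghz):
    # peak FP32 = cores * 2 ops per clock (fused multiply-add) * clock
    return shader_cores * 2 * clock_ghz / 1000

print(tflops(2560, 1.905))  # RX 5700 XT at boost clock: ~9.75 TF
print(tflops(2304, 2.230))  # PS5 (up to 2.23 GHz):      ~10.28 TF
print(tflops(3328, 1.825))  # Series X:                  ~12.15 TF
```

Peak ALU throughput only materializes if bandwidth and the fixed-function front end can keep the CUs fed, which is exactly the idling you're describing.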
 
If you look at PC benchmarks for a 5700 XT, it's running north of 100 fps at 4K.
The consoles seem to cluster around the 1440p mark for 120 fps.

[image: pc-master-race.jpg] :yes::mrgreen::p:love:

Seriously, maybe having the console editions locked/v-synced at typical consumer TV resolutions and refresh rates (4K/60) was more important, with 1440p/120hz for those consumer TV sets that support it.
 