Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

Regarding the NXGamer video, I’m surprised PS5 does so well... again, MS have been going on about their low latency, and the DS5 packing more tech might imply a white-wash.

But maybe that’s just my poor thinking on how these things work!?
As per the results reported by NXGamer listed below by function, the data is unfortunately inconclusive. The measurement will always be a mixture of game engine + drivers + the controller itself. Assuming the SDK/driver + controller is where latency can be reduced by the platform holders, the engine code itself cannot be. With a variety of engines now able to decouple input polling from framerate, and without knowing how often the engine polls for controller input, there is no consistent way to isolate the driver and controller portion of the latency. Comparing the same game across platforms is unfortunately not a useful comparison either: it rests on the assumption that polling rates, game code updates and CPU speeds are all consistent between the two, which they may not be (and likely are not), even if the end result is somewhat similar.
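To put rough numbers on that (purely illustrative - the 4 ms controller/driver figure and the simple poll -> simulate -> present model are my assumptions, not measurements of either console):

```cpp
// Hypothetical click-to-photon budget. The controller/driver cost is an assumed
// figure, and the pipeline model is deliberately simplified - not how any
// specific engine or platform actually works.
#include <cstdio>

int main() {
    const double controller_ms = 4.0;                // assumed pad + wireless + driver cost
    const double poll_hz[]  = {30.0, 60.0, 120.0};   // how often the engine samples input
    const double frame_hz[] = {30.0, 60.0, 120.0};   // how often a frame is presented

    for (double p : poll_hz) {
        for (double f : frame_hz) {
            // Worst case: the press lands just after a poll, waits one full poll
            // interval, one simulation tick, then one frame of presentation.
            double worst_ms = controller_ms + 2.0 * (1000.0 / p) + 1000.0 / f;
            std::printf("poll %3.0f Hz, render %3.0f Hz -> worst case ~%.1f ms\n", p, f, worst_ms);
        }
    }
    return 0;
}
```

The point is only that the engine-side terms match or dwarf the controller term, so comparing the same game on two platforms can't isolate the controller/driver portion unless you know how each version polls.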

I also thought it was interesting that NX Gamer says the PS5 just about pips the Series X, when I think his results show - if anything - that it's not possible to determine in any meaningful way that either platform is inherently better for next gen game input latency.

AC: Valhalla
60 fps mode: XSX 6 ms faster
30 fps mode: XSX 38 ms faster (u wot m8?)

Watch Dogs: Legion
30 fps: XSX 10 ms faster

Dirt 5
120 fps mode: draw
60 fps mode: PS5 13 ms faster (XSX laggier than X1S == PS4)


COD: Cold War
120 fps mode: PS5 4 ms faster
60 fps mode: PS5 6 ms faster

Removing the obviously game-related figures from Dirt 5's 60 fps mode (the XSX platform is laggier than X1S == PS5 == PS4), and also the single biggest delta of all in AC: Valhalla's 30 fps mode (PS5 loses an additional 32 ms dropping to 30 fps, basically an entire additional frame over XSX), you end up with really small differences that can go either way.

Based on this (limited) data it's not possible to say that either platform has an advantage for next gen games. Both are similarly low when you remove obvious outliers like AC:V 30 fps on PS5.
 
NXGamer is in Nexusgamer territory imo. They both make comparisons adjusted to their own liking.
I'm not particularly questioning the results that NXG posted, the results are what he measured.
I just wanted to address the question of how much of those results can be separated out into whether MS was lying about its work on latency or not. The results posted by NXG are inconclusive in answering that; the reality is, we couldn't possibly know how much latency MS removed by looking at numbers like this.

To figure out that latency, you'd need a devkit and the SDK, and just measure from button press to a print statement on the console with nothing in between. That should give you an idea of the work MS did in reducing latency.
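Without a devkit, the closest illustration I can give is a toy version of that measurement on PC: a fake "driver" thread timestamps a simulated press, and a loop polling at ~120 Hz prints how long the press sat waiting. Everything here (the fake input source, the 120 Hz figure) is made up purely to show the shape of the test:

```cpp
// Toy "press to print" measurement. The input source is simulated - there is no
// real controller or console SDK involved - so the printed delta is just the
// polling-induced wait, with no driver, render, or display in the chain.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

using Clock = std::chrono::steady_clock;

int main() {
    std::atomic<long long> press_ns{0};   // timestamp of the simulated button press

    // Fake "driver": fires a press every 100 ms and records exactly when.
    std::thread driver([&] {
        for (int i = 0; i < 20; ++i) {
            std::this_thread::sleep_for(std::chrono::milliseconds(100));
            press_ns.store(Clock::now().time_since_epoch().count());
        }
    });

    // "Game loop": polls at roughly 120 Hz and prints press-to-poll latency.
    long long last_seen = 0;
    for (int frame = 0; frame < 260; ++frame) {
        long long p = press_ns.load();
        if (p != 0 && p != last_seen) {
            last_seen = p;
            long long now = Clock::now().time_since_epoch().count();
            std::printf("press-to-poll latency: %.2f ms\n", (now - p) / 1e6);
        }
        std::this_thread::sleep_for(std::chrono::microseconds(8333));
    }
    driver.join();
    return 0;
}
```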

When it comes to looking at the whole thing, controller input processing is trickier than just taking in inputs. There could be a slew of calculations during input processing for one game like Dirt 5 that would be dramatically different from how another game like AC Valhalla would do it. This also makes measurement harder, because it may be in the interest of the game engine not to act on an action immediately. It may be checking to see whether other animations have finished first, or whether the character is back in its neutral state before it performs a new animation. The controller input may have been registered, but the game may not act on it until a frame later, for instance.
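As a minimal sketch of that last point (the state names and frame numbers are invented, not from any real engine): the input is latched the moment it arrives, but the action only fires once the animation state allows it, so the visible response can land a few frames later.

```cpp
// Sketch: the press is registered on frame 0, but the character is mid-attack,
// so the queued action only executes once the animation returns to idle.
#include <cstdio>

enum class AnimState { Idle, Attacking, Recovering };

bool can_interrupt(AnimState s) { return s == AnimState::Idle; }

int main() {
    AnimState state = AnimState::Attacking;
    bool queued_jump = false;

    for (int frame = 0; frame < 6; ++frame) {
        bool jump_pressed = (frame == 0);            // press lands on frame 0
        if (jump_pressed) queued_jump = true;        // registered immediately...

        if (queued_jump && can_interrupt(state)) {   // ...but only acted on here
            std::printf("jump executes on frame %d\n", frame);
            queued_jump = false;
        }
        if (frame == 1) state = AnimState::Recovering;
        if (frame == 3) state = AnimState::Idle;     // attack animation ends
    }
    return 0;
}
```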

For driving games, you need to be careful about overdoing the amount of steering when someone is pressing down on the controller, so perhaps you wait for two consecutive inputs to ensure that's what the driver wanted. Or how much steering is actually applied may be accelerated as more consecutive frames of input come in, such that a single frame of input is very minor (and also very difficult for a camera to capture when performing these tests). Controllers are not finely tuned input devices like wheels and pedals, which have much more sensitivity.
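A crude illustration of that ramp (the 25%-per-frame curve is just something I made up): a single-frame tap barely moves the steering, while a sustained hold climbs to full lock, which is exactly the kind of response a camera-based latency test struggles to catch.

```cpp
// Sketch: steering gain ramps with consecutive held frames, so a one-frame tap
// produces only a small, hard-to-film deflection.
#include <algorithm>
#include <cstdio>

int main() {
    const int input[] = {1, 0, 0, 1, 1, 1, 1, 1};   // 1 = stick held this frame
    int held_frames = 0;

    for (int i = 0; i < 8; ++i) {
        held_frames = input[i] ? held_frames + 1 : 0;
        // Invented ramp: 25% of full lock per consecutive held frame, capped at 100%.
        double steering = std::min(1.0, held_frames * 0.25) * (input[i] ? 1.0 : 0.0);
        std::printf("frame %d: steering %.2f\n", i, steering);
    }
    return 0;
}
```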

In fighting games, where players can queue up button inputs for combination attacks, those are pretty much as close to raw as you can get. You can step into any trainer and see all inputs are captured and none are ignored.
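For contrast, a trainer-style input display is basically just a log that records every sampled input against its frame number, nothing deferred and nothing dropped (the button names here are placeholders):

```cpp
// Sketch of a trainer-style input history: every non-empty sample is recorded
// against the frame it arrived on, so nothing is silently discarded.
#include <cstdio>
#include <deque>
#include <string>
#include <utility>

int main() {
    const char* sampled[] = {"down", "down-fwd", "fwd", "punch", "", "kick"};
    std::deque<std::pair<int, std::string>> history;   // (frame, input)

    for (int frame = 0; frame < 6; ++frame) {
        if (sampled[frame][0] != '\0')
            history.emplace_back(frame, sampled[frame]);
    }
    for (const auto& [frame, input] : history)
        std::printf("frame %d: %s\n", frame, input.c_str());
    return 0;
}
```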

So how the developer wants to process inputs will matter, and it may change between platforms as well.

Hope that helps.
 
In fighting games, where players can queue up button inputs for combination attacks, those are pretty much as close to raw as you can get. You can step into any trainer and see all inputs are captured and none are ignored.

It's also wireless; I wonder how much things would improve if controllers could be wired (for competitions they do that with Tekken, right?). Also wonder how much latency we really have on PC with kb/m in competitive FPS games like CS, CoD etc.
Guess all this isn't that important to most; I have CoD on PS5 and even with kb/m there it's not going to break casual deathmatches. Sims could be another thing (Arma, DCS etc.).
 
We are getting input lag seen in NES and SNES games (those running at 60fps) using a CRT monitor (40ms).

We're getting closer for sure, but an NES / MS / SNES / MD game could have lag of as little as 16 ms! That's to say, ~16 ms (or potentially even less, but probably not) after the pad was sampled you could be beginning to draw the frame on the CRT (so roughly 16 ~ 33 ms across the duration of scanning the frame out on the CRT screen).

NX Gamer is removing display lag from his figures to try and determine what the system latency is, and COD is getting down to roughly 33 ms, or four frames, at 120 fps. Probably another 15 ~ 150 ms for display lag on top, depending on the tv and its display mode. Something like that.

So very impressive absolutely, but still a little way off a fast old school game, on a scanline sprite renderer with direct output (no frame buffers!), on a CRT.
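For reference, the frame-time arithmetic behind those figures (display lag excluded):

```cpp
// Frame durations behind the numbers above (display lag not included).
#include <cstdio>

int main() {
    std::printf("1 frame  @  60 Hz = %.1f ms   (old-school best case)\n", 1000.0 / 60.0);
    std::printf("4 frames @ 120 Hz = %.1f ms   (COD Cold War's ~33 ms system latency)\n", 4.0 * 1000.0 / 120.0);
    std::printf("1 frame  @  30 Hz = %.1f ms   (why dropping to a 30 fps mode costs so much)\n", 1000.0 / 30.0);
    return 0;
}
```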

I suppose this would be time for me to whine again about me throwing out all my CRTs years ago. What a clown.
 
It's also wireless; I wonder how much things would improve if controllers could be wired (for competitions they do that with Tekken, right?). Also wonder how much latency we really have on PC with kb/m in competitive FPS games like CS, CoD etc.
Guess all this isn't that important to most; I have CoD on PS5 and even with kb/m there it's not going to break casual deathmatches. Sims could be another thing (Arma, DCS etc.).
You can influence mouse latency by adjusting your polling rate and device drivers. There are mouse utilities specifically aimed at adjusting these parameters for a faster and/or smoother response.

something like this:
https://www.overclock.net/threads/d...ow-1000hz-mouse-driver.1597441/#post_26247598

would allow you to poll at 1000 Hz. My Razer also comes with polling-rate sliders. The higher the rate, the more pressure on your CPU though.

wrt your question, on PC it's as much as you can handle I guess.
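As a rough way to see what the polling rate alone contributes (ignoring the sensor, USB stack, game and display): an event waits on average half a polling interval, and at worst a full one.

```cpp
// Polling-interval contribution to mouse latency at common USB report rates.
#include <cstdio>

int main() {
    const double rates_hz[] = {125.0, 500.0, 1000.0};
    for (double hz : rates_hz) {
        double interval_ms = 1000.0 / hz;
        std::printf("%5.0f Hz: avg wait %.2f ms, worst %.2f ms\n",
                    hz, interval_ms / 2.0, interval_ms);
    }
    return 0;
}
```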
 
The versions tested were 1.000.003 on PS5 and 3.0.0.3 on Xbox Series X|S.

Timestamps:
0:00 - PS5
3:20 - Xbox Series X
6:40 - Xbox Series S

The stage at 2:19 was the only stage found where the frame rate could drop below 60fps on PS5.

Krypt mode runs at 30fps http://bit.ly/35iIIqa

PS5 uses a dynamic resolution with the highest resolution found being 3840x2160 and the lowest resolution found being approximately 3328x1872.

Xbox Series X uses a dynamic resolution with the highest resolution found being 3840x2160 and the lowest resolution found being approximately 3584x2016.

The resolution seems to very rarely drop below 3840x2160 on PS5 and Xbox Series X. The only resolution found on PS5 and Xbox Series X during the scenes capped at 30fps was 3840x2160.

During gameplay the Xbox Series S uses a dynamic resolution with the highest resolution found being approximately 2688x1512 and the lowest resolution found being 1920x1080. Pixel counts at 1920x1080 are rare on Xbox Series S.

During scenes capped at 30fps the Xbox Series S uses a dynamic resolution with the highest resolution found being approximately 3456x1944 and the lowest resolution found being 2560x1440.

Stats: http://bit.ly/2XlIVEJ
Frames Pixel Counted: http://bit.ly/3nnwVx5

 
What's the reason for Series S running a resolution cap of 2688x1512?
It would seem to be completely pointless.

A better looking image than if it had targeted 1080p/1440p only?
 
I'm not particularly questioning the results that NXG posted, the results are what he measured.
I just wanted to address the question of how much of those results can be separated out into whether MS was lying about its work on latency or not. The results posted by NXG are inconclusive in answering that; the reality is, we couldn't possibly know how much latency MS removed by looking at numbers like this.

To figure out that latency, you'd need a devkit and the SDK, and just measure from button press to a print statement on the console with nothing in between. That should give you an idea of the work MS did in reducing latency.

When it comes to looking at the whole thing, controller input processing is trickier than just taking in inputs. There could be a slew of calculations during input processing for one game like Dirt 5 that would be dramatically different from how another game like AC Valhalla would do it. This also makes measurement harder, because it may be in the interest of the game engine not to act on an action immediately. It may be checking to see whether other animations have finished first, or whether the character is back in its neutral state before it performs a new animation. The controller input may have been registered, but the game may not act on it until a frame later, for instance.

For driving games, you need to be careful about overdoing the amount of steering when someone is pressing down on the controller, so perhaps you wait for two consecutive inputs to ensure that's what the driver wanted. Or how much steering is actually applied may be accelerated as more consecutive frames of input come in, such that a single frame of input is very minor (and also very difficult for a camera to capture when performing these tests). Controllers are not finely tuned input devices like wheels and pedals, which have much more sensitivity.

In fighting games, where players can queue up button inputs for combination attacks, those are pretty much as close to raw as you can get. You can step into any trainer and see all inputs are captured and none are ignored.

So how the developer wants to process inputs will matter, and it may change between platforms as well.

Hope that helps.
Just to clarify, I wasn’t implying MS lied - just that they talked about it a lot.

At the end of the day the only ‘fair’ comparison will be like for like, those are the games where you might be at a disadvantage playing online against someone...so a game like CoD is very important, massive seller on both systems and playing against each other.
 
Just to clarify, I wasn’t implying MS lied - just that they talked about it a lot.

At the end of the day the only ‘fair’ comparison will be like for like, those are the games where you might be at a disadvantage playing online against someone...so a game like CoD is very important, massive seller on both systems and playing against each other.
No worries; I apologize, I didn't mean to take your words out of context - but I guess it just came down to whether MS were overselling how much they worked on latency. I get the question, and the numbers from NXG are fine; we just have no way to answer your specific question with how he measured it.
 
What's the reason for Series S running a resolution cap of 2688x1512?
It would seem to be completely pointless.

Once you're rescaling your image either up to 4K or down to 1080 the only real rule is that higher is better. Outside of 1440p displays there's no real reason that 1440p has any more 'point' or legitimacy than 1512p - it's simply lower and so produces a less detailed image.
 
What's the reason for Series S running a resolution cap of 2688x1512?
It would seem to be completely pointless.

I agree 2.1x native is weird -- surely 2x (1440p) would scale better and make a more stable image? Anyone know if there's a detail to how the downscaling filter works that explains why this could be?
 
I agree 2.1x native is weird -- surely 2x (1440p) would scale better and make a more stable image? Anyone know if there's a detail to how the downscaling filter works that explains why this could be?

1440p is neither 2x 1080p (by number of pixels) nor 50% of 4K - varying fractions of pixels in all directions are going to be involved in any up or down scale as you create the final image - same as for 1512p (or whatever).

The prevalence of dynamic-res games these days demonstrates that there is no special non-native resolution that games should aim to hit. Higher is basically better.

(I suppose an exception might be if you liked the sharpest possible image, and wanted to do an integer 2 x 2 upscale e.g. 1080p to 2160p, but this would rapidly look lower detail than upscaling from e.g. 1200p or 1500p etc).
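Quick pixel-count check of the resolutions in question, which shows 1440p sitting at roughly 1.78x 1080p and 44% of 4K, while 1512p lands at about 1.96x and 49%:

```cpp
// Pixel counts and ratios for the resolutions being discussed.
#include <cstdio>

int main() {
    struct Res { const char* name; int w, h; } modes[] = {
        {"1080p", 1920, 1080},
        {"1440p", 2560, 1440},
        {"1512p", 2688, 1512},
        {"2160p", 3840, 2160},
    };
    const double px_1080p = 1920.0 * 1080.0;
    const double px_2160p = 3840.0 * 2160.0;

    for (const auto& r : modes) {
        double px = double(r.w) * r.h;
        std::printf("%s: %9.0f px  (%.2fx 1080p, %4.1f%% of 4K)\n",
                    r.name, px, px / px_1080p, 100.0 * px / px_2160p);
    }
    return 0;
}
```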
 
What's the reason for Series S running a resolution cap of 2688x1512?
It would seem to be completely pointless.
Well, I guess the game just uses a real dynamic resolution (as it should). If I had to guess, I would guess it goes up to 4K as long as the hardware can deliver. The whole point of dynamic resolution is to always max out the hardware without sacrificing framerate.

So if MS delivers a faster Series S (e.g. in the next iteration of consoles), the game might still use the "S" path (if there is one) and might just give us a 4K image. Maybe the game doesn't use completely different paths for the S and X but just scales via resolution and available memory (the features are the same, and the CPU almost is). This would make sense for future generations.


edit:
btw, those numbers also speak for my theory that the resolution is just capped at 4K on all platforms:
Xbox Series S (30fps mode)
Highest: 3456x1944
Lowest: 2560x1440

In the 30fps cinematics the resolution goes up a little bit more. So with a hypothetical x1s2 it might end up at native 4K.
 

Thank you. That's a much more logical explanation than the others provided.
 
As you can see in the video, the changes are mainly in resolution and framerate. PS5 now renders at the maximum resolution that PS4 Pro reached (1200p) at a pretty solid 60fps. There are no other noticeable changes.

- Series S has increased framerate to 60 (previously 45)
- The resolution of Series S is dynamic, between 900p and 1080p, staying mostly at 1080p.
- Shadows are better on Series X compared to PS5 and Series S.
- Series X in resolution mode has a dynamic resolution ranging from 1512p~2160p with 4K upscaling. In FPS mode, the resolution is between 1080p~1440p.
- PS5 only has one display mode. It has increased its framerate to 60 (previously 45) and its resolution stands at 1200p with rescaling to 1440p.


Per EA.
Some of you may have noticed a new title update being downloaded for Star Wars Jedi: Fallen Order™. Below you will find the release notes for this update, which improves backwards compatibility performance specifically on the latest generation of consoles.

High Level Summary of Features:
  • Improved framerate on Xbox Series X|S and PlayStation 5
  • Improved dynamic resolution ranges, for a higher resolution experience on Xbox Series X|S and PlayStation 5
  • Improved post-processing resolution for Xbox Series X and PS5. (Not Xbox Series S)

Console Specifics:


Xbox Series S
  • Framerate has been increased to 60 FPS (up from 45 FPS)
Xbox Series X Performance mode
  • Framerate has been increased to 60 FPS
  • Dynamic resolution added in the range of 1080p to 1440p
Xbox Series X Normal mode (non-performance mode)
  • Postprocessing has been increased to 4K
  • Dynamic resolution in the range of 1512p to 2160p
PlayStation 5
  • Framerate has been increased to 60 FPS (up from 45 FPS)
  • Postprocessing increased to 1440p
  • Dynamic resolution has been disabled and the game is rendering at 1200p (up from 810-1080p)

So no, there is no dynamic 4K/30fps mode (or non-performance mode) for PS5, as the PS4 Pro code didn't support it.
 